Jan 12 13:02:13 localhost kernel: Linux version 5.14.0-655.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Mon Dec 29 08:24:22 UTC 2025
Jan 12 13:02:13 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 12 13:02:13 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-655.el9.x86_64 root=UUID=f2a0a5c1-133f-4977-b837-e40b31cbd9cc ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 12 13:02:13 localhost kernel: BIOS-provided physical RAM map:
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 12 13:02:13 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000027fffffff] usable
Jan 12 13:02:13 localhost kernel: NX (Execute Disable) protection: active
Jan 12 13:02:13 localhost kernel: APIC: Static calls initialized
Jan 12 13:02:13 localhost kernel: SMBIOS 2.8 present.
Jan 12 13:02:13 localhost kernel: DMI: Red Hat OpenStack Compute/RHEL, BIOS 1.16.1-1.el9 04/01/2014
Jan 12 13:02:13 localhost kernel: Hypervisor detected: KVM
Jan 12 13:02:13 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 12 13:02:13 localhost kernel: kvm-clock: using sched offset of 3277431129 cycles
Jan 12 13:02:13 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 12 13:02:13 localhost kernel: tsc: Detected 2445.406 MHz processor
Jan 12 13:02:13 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 12 13:02:13 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 12 13:02:13 localhost kernel: last_pfn = 0x280000 max_arch_pfn = 0x400000000
Jan 12 13:02:13 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 12 13:02:13 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 12 13:02:13 localhost kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
Jan 12 13:02:13 localhost kernel: found SMP MP-table at [mem 0x000f5b60-0x000f5b6f]
Jan 12 13:02:13 localhost kernel: Using GB pages for direct mapping
Jan 12 13:02:13 localhost kernel: RAMDISK: [mem 0x2d461000-0x32a28fff]
Jan 12 13:02:13 localhost kernel: ACPI: Early table checksum verification disabled
Jan 12 13:02:13 localhost kernel: ACPI: RSDP 0x00000000000F5B20 000014 (v00 BOCHS )
Jan 12 13:02:13 localhost kernel: ACPI: RSDT 0x000000007FFE35EB 000034 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 12 13:02:13 localhost kernel: ACPI: FACP 0x000000007FFE3403 0000F4 (v03 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 12 13:02:13 localhost kernel: ACPI: DSDT 0x000000007FFDFCC0 003743 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 12 13:02:13 localhost kernel: ACPI: FACS 0x000000007FFDFC80 000040
Jan 12 13:02:13 localhost kernel: ACPI: APIC 0x000000007FFE34F7 000090 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 12 13:02:13 localhost kernel: ACPI: MCFG 0x000000007FFE3587 00003C (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 12 13:02:13 localhost kernel: ACPI: WAET 0x000000007FFE35C3 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 12 13:02:13 localhost kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe3403-0x7ffe34f6]
Jan 12 13:02:13 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfcc0-0x7ffe3402]
Jan 12 13:02:13 localhost kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfc80-0x7ffdfcbf]
Jan 12 13:02:13 localhost kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe34f7-0x7ffe3586]
Jan 12 13:02:13 localhost kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe3587-0x7ffe35c2]
Jan 12 13:02:13 localhost kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe35c3-0x7ffe35ea]
Jan 12 13:02:13 localhost kernel: No NUMA configuration found
Jan 12 13:02:13 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000027fffffff]
Jan 12 13:02:13 localhost kernel: NODE_DATA(0) allocated [mem 0x27ffd5000-0x27fffffff]
Jan 12 13:02:13 localhost kernel: crashkernel reserved: 0x000000006f000000 - 0x000000007f000000 (256 MB)
Jan 12 13:02:13 localhost kernel: Zone ranges:
Jan 12 13:02:13 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 12 13:02:13 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 12 13:02:13 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000027fffffff]
Jan 12 13:02:13 localhost kernel:   Device   empty
Jan 12 13:02:13 localhost kernel: Movable zone start for each node
Jan 12 13:02:13 localhost kernel: Early memory node ranges
Jan 12 13:02:13 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 12 13:02:13 localhost kernel:   node   0: [mem 0x0000000000100000-0x000000007ffdafff]
Jan 12 13:02:13 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000027fffffff]
Jan 12 13:02:13 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000027fffffff]
Jan 12 13:02:13 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 12 13:02:13 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 12 13:02:13 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 12 13:02:13 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 12 13:02:13 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 12 13:02:13 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 12 13:02:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 12 13:02:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 12 13:02:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 12 13:02:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 12 13:02:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 12 13:02:13 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 12 13:02:13 localhost kernel: TSC deadline timer available
Jan 12 13:02:13 localhost kernel: CPU topo: Max. logical packages:   4
Jan 12 13:02:13 localhost kernel: CPU topo: Max. logical dies:       4
Jan 12 13:02:13 localhost kernel: CPU topo: Max. dies per package:   1
Jan 12 13:02:13 localhost kernel: CPU topo: Max. threads per core:   1
Jan 12 13:02:13 localhost kernel: CPU topo: Num. cores per package:     1
Jan 12 13:02:13 localhost kernel: CPU topo: Num. threads per package:   1
Jan 12 13:02:13 localhost kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Jan 12 13:02:13 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 12 13:02:13 localhost kernel: kvm-guest: KVM setup pv remote TLB flush
Jan 12 13:02:13 localhost kernel: kvm-guest: setup PV sched yield
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x7ffdb000-0x7fffffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x80000000-0xafffffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xb0000000-0xbfffffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfed1bfff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed1c000-0xfed1ffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed20000-0xfeffbfff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 12 13:02:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 12 13:02:13 localhost kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 12 13:02:13 localhost kernel: Booting paravirtualized kernel on KVM
Jan 12 13:02:13 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 12 13:02:13 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Jan 12 13:02:13 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u524288
Jan 12 13:02:13 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u524288 alloc=1*2097152
Jan 12 13:02:13 localhost kernel: pcpu-alloc: [0] 0 1 2 3 
Jan 12 13:02:13 localhost kernel: kvm-guest: PV spinlocks enabled
Jan 12 13:02:13 localhost kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 12 13:02:13 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-655.el9.x86_64 root=UUID=f2a0a5c1-133f-4977-b837-e40b31cbd9cc ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 12 13:02:13 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-655.el9.x86_64", will be passed to user space.
Jan 12 13:02:13 localhost kernel: random: crng init done
Jan 12 13:02:13 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 12 13:02:13 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 12 13:02:13 localhost kernel: Fallback order for Node 0: 0 
Jan 12 13:02:13 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 12 13:02:13 localhost kernel: Policy zone: Normal
Jan 12 13:02:13 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 12 13:02:13 localhost kernel: software IO TLB: area num 4.
Jan 12 13:02:13 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 12 13:02:13 localhost kernel: ftrace: allocating 49414 entries in 194 pages
Jan 12 13:02:13 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 12 13:02:13 localhost kernel: Dynamic Preempt: voluntary
Jan 12 13:02:13 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 12 13:02:13 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 12 13:02:13 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=4.
Jan 12 13:02:13 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 12 13:02:13 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 12 13:02:13 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 12 13:02:13 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 12 13:02:13 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 12 13:02:13 localhost kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 12 13:02:13 localhost kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 12 13:02:13 localhost kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 12 13:02:13 localhost kernel: NR_IRQS: 524544, nr_irqs: 456, preallocated irqs: 16
Jan 12 13:02:13 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 12 13:02:13 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 12 13:02:13 localhost kernel: Console: colour VGA+ 80x25
Jan 12 13:02:13 localhost kernel: printk: console [ttyS0] enabled
Jan 12 13:02:13 localhost kernel: ACPI: Core revision 20230331
Jan 12 13:02:13 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 12 13:02:13 localhost kernel: x2apic enabled
Jan 12 13:02:13 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 12 13:02:13 localhost kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jan 12 13:02:13 localhost kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jan 12 13:02:13 localhost kernel: kvm-guest: setup PV IPIs
Jan 12 13:02:13 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 12 13:02:13 localhost kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Jan 12 13:02:13 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 12 13:02:13 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 12 13:02:13 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 12 13:02:13 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 12 13:02:13 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 12 13:02:13 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 12 13:02:13 localhost kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 12 13:02:13 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 12 13:02:13 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 12 13:02:13 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 12 13:02:13 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 12 13:02:13 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 12 13:02:13 localhost kernel: Transient Scheduler Attacks: Vulnerable: No microcode
Jan 12 13:02:13 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 12 13:02:13 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 12 13:02:13 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 12 13:02:13 localhost kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Jan 12 13:02:13 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 12 13:02:13 localhost kernel: x86/fpu: xstate_offset[9]:  832, xstate_sizes[9]:    8
Jan 12 13:02:13 localhost kernel: x86/fpu: Enabled xstate features 0x207, context size is 840 bytes, using 'compacted' format.
Jan 12 13:02:13 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 12 13:02:13 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 12 13:02:13 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 12 13:02:13 localhost kernel: landlock: Up and running.
Jan 12 13:02:13 localhost kernel: Yama: becoming mindful.
Jan 12 13:02:13 localhost kernel: SELinux:  Initializing.
Jan 12 13:02:13 localhost kernel: LSM support for eBPF active
Jan 12 13:02:13 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 12 13:02:13 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 12 13:02:13 localhost kernel: smpboot: CPU0: AMD EPYC 7763 64-Core Processor (family: 0x19, model: 0x1, stepping: 0x1)
Jan 12 13:02:13 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 12 13:02:13 localhost kernel: ... version:                0
Jan 12 13:02:13 localhost kernel: ... bit width:              48
Jan 12 13:02:13 localhost kernel: ... generic registers:      6
Jan 12 13:02:13 localhost kernel: ... value mask:             0000ffffffffffff
Jan 12 13:02:13 localhost kernel: ... max period:             00007fffffffffff
Jan 12 13:02:13 localhost kernel: ... fixed-purpose events:   0
Jan 12 13:02:13 localhost kernel: ... event mask:             000000000000003f
Jan 12 13:02:13 localhost kernel: signal: max sigframe size: 3376
Jan 12 13:02:13 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 12 13:02:13 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 12 13:02:13 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 12 13:02:13 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 12 13:02:13 localhost kernel: .... node  #0, CPUs:      #1 #2 #3
Jan 12 13:02:13 localhost kernel: smp: Brought up 1 node, 4 CPUs
Jan 12 13:02:13 localhost kernel: smpboot: Total of 4 processors activated (19563.24 BogoMIPS)
Jan 12 13:02:13 localhost kernel: node 0 deferred pages initialised in 7ms
Jan 12 13:02:13 localhost kernel: Memory: 7766384K/8388068K available (16384K kernel code, 5796K rwdata, 13908K rodata, 4196K init, 7200K bss, 617192K reserved, 0K cma-reserved)
Jan 12 13:02:13 localhost kernel: devtmpfs: initialized
Jan 12 13:02:13 localhost kernel: x86/mm: Memory block size: 128MB
Jan 12 13:02:13 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 12 13:02:13 localhost kernel: futex hash table entries: 1024 (65536 bytes on 1 NUMA nodes, total 64 KiB, linear).
Jan 12 13:02:13 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 12 13:02:13 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 12 13:02:13 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 12 13:02:13 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 12 13:02:13 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 12 13:02:13 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 12 13:02:13 localhost kernel: audit: type=2000 audit(1768222933.651:1): state=initialized audit_enabled=0 res=1
Jan 12 13:02:13 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 12 13:02:13 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 12 13:02:13 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 12 13:02:13 localhost kernel: cpuidle: using governor menu
Jan 12 13:02:13 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 12 13:02:13 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Jan 12 13:02:13 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 12 13:02:13 localhost kernel: PCI: Using configuration type 1 for base access
Jan 12 13:02:13 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 12 13:02:13 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 12 13:02:13 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 12 13:02:13 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 12 13:02:13 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 12 13:02:13 localhost kernel: Demotion targets for Node 0: null
Jan 12 13:02:13 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 12 13:02:13 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 12 13:02:13 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 12 13:02:13 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 12 13:02:13 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 12 13:02:13 localhost kernel: ACPI: Interpreter enabled
Jan 12 13:02:13 localhost kernel: ACPI: PM: (supports S0 S5)
Jan 12 13:02:13 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 12 13:02:13 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 12 13:02:13 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 12 13:02:13 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 12 13:02:13 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 12 13:02:13 localhost kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 12 13:02:13 localhost kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR DPC]
Jan 12 13:02:13 localhost kernel: acpi PNP0A08:00: _OSC: OS now controls [SHPCHotplug PME AER PCIeCapability]
Jan 12 13:02:13 localhost kernel: PCI host bridge to bus 0000:00
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x280000000-0xa7fffffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: BAR 0 [mem 0xf9800000-0xf9ffffff pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: BAR 2 [mem 0xfc200000-0xfc203fff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: BAR 0 [mem 0xfea1a000-0xfea1afff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: BAR 0 [mem 0xfea1b000-0xfea1bfff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: BAR 0 [mem 0xfea1c000-0xfea1cfff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: BAR 0 [mem 0xfea1d000-0xfea1dfff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: BAR 0 [mem 0xfea1e000-0xfea1efff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: BAR 0 [mem 0xfea1f000-0xfea1ffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: BAR 0 [mem 0xfea20000-0xfea20fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: BAR 0 [mem 0xfea21000-0xfea21fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:00:1f.0: quirk: [io  0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 12 13:02:13 localhost kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:00:1f.2: BAR 4 [io  0xd040-0xd05f]
Jan 12 13:02:13 localhost kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea22000-0xfea22fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:00:1f.3: BAR 4 [io  0x0700-0x073f]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0: BAR 0 [mem 0xfc800000-0xfc8000ff 64bit]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:02: extended config space not accessible
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [1] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [2] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [3] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [4] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [5] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [6] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [7] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [8] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [9] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [10] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [11] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [12] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [13] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [14] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [15] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [16] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [17] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [18] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [19] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [20] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [21] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [22] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [23] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [24] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [25] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [26] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [27] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [28] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [29] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [30] registered
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [31] registered
Jan 12 13:02:13 localhost kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:02:01.0: BAR 4 [io  0xc000-0xc01f]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-2] registered
Jan 12 13:02:13 localhost kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe840000-0xfe840fff]
Jan 12 13:02:13 localhost kernel: pci 0000:03:00.0: BAR 4 [mem 0xfbe00000-0xfbe03fff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:03:00.0: ROM [mem 0xfe800000-0xfe83ffff pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-3] registered
Jan 12 13:02:13 localhost kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:04:00.0: BAR 1 [mem 0xfe600000-0xfe600fff]
Jan 12 13:02:13 localhost kernel: pci 0000:04:00.0: BAR 4 [mem 0xfbc00000-0xfbc03fff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-4] registered
Jan 12 13:02:13 localhost kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:05:00.0: BAR 4 [mem 0xfba00000-0xfba03fff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-5] registered
Jan 12 13:02:13 localhost kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 12 13:02:13 localhost kernel: pci 0000:06:00.0: BAR 4 [mem 0xfb800000-0xfb803fff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-6] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-7] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-8] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-9] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-10] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-11] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-12] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-13] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-14] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-15] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-16] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 12 13:02:13 localhost kernel: acpiphp: Slot [0-17] registered
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 12 13:02:13 localhost kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 12 13:02:13 localhost kernel: iommu: Default domain type: Translated
Jan 12 13:02:13 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 12 13:02:13 localhost kernel: SCSI subsystem initialized
Jan 12 13:02:13 localhost kernel: ACPI: bus type USB registered
Jan 12 13:02:13 localhost kernel: usbcore: registered new interface driver usbfs
Jan 12 13:02:13 localhost kernel: usbcore: registered new interface driver hub
Jan 12 13:02:13 localhost kernel: usbcore: registered new device driver usb
Jan 12 13:02:13 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 12 13:02:13 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 12 13:02:13 localhost kernel: PTP clock support registered
Jan 12 13:02:13 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 12 13:02:13 localhost kernel: NetLabel: Initializing
Jan 12 13:02:13 localhost kernel: NetLabel:  domain hash size = 128
Jan 12 13:02:13 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 12 13:02:13 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 12 13:02:13 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 12 13:02:13 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 12 13:02:13 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 12 13:02:13 localhost kernel: e820: reserve RAM buffer [mem 0x7ffdb000-0x7fffffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 12 13:02:13 localhost kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 12 13:02:13 localhost kernel: vgaarb: loaded
Jan 12 13:02:13 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 12 13:02:13 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 12 13:02:13 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 12 13:02:13 localhost kernel: pnp: PnP ACPI init
Jan 12 13:02:13 localhost kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 12 13:02:13 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 12 13:02:13 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 12 13:02:13 localhost kernel: NET: Registered PF_INET protocol family
Jan 12 13:02:13 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 12 13:02:13 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 12 13:02:13 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 12 13:02:13 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 12 13:02:13 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 12 13:02:13 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 12 13:02:13 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 12 13:02:13 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 12 13:02:13 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 12 13:02:13 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 12 13:02:13 localhost kernel: NET: Registered PF_XDP protocol family
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x0fff] to [bus 03] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: bridge window [io  0x1000-0x0fff] to [bus 04] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: bridge window [io  0x1000-0x0fff] to [bus 05] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: bridge window [io  0x1000-0x0fff] to [bus 06] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: bridge window [io  0x1000-0x0fff] to [bus 07] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: bridge window [io  0x1000-0x0fff] to [bus 08] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: bridge window [io  0x1000-0x0fff] to [bus 09] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: bridge window [io  0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: bridge window [io  0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: bridge window [io  0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: bridge window [io  0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: bridge window [io  0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: bridge window [io  0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: bridge window [io  0x1000-0x0fff] to [bus 10] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: bridge window [io  0x1000-0x0fff] to [bus 11] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x0fff] to [bus 12] add_size 1000
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x1fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: bridge window [io  0x2000-0x2fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: bridge window [io  0x3000-0x3fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: bridge window [io  0x4000-0x4fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: bridge window [io  0x5000-0x5fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: bridge window [io  0x6000-0x6fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: bridge window [io  0x7000-0x7fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: bridge window [io  0x8000-0x8fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: bridge window [io  0x9000-0x9fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: bridge window [io  0xa000-0xafff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: bridge window [io  0xb000-0xbfff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: bridge window [io  0xe000-0xefff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: bridge window [io  0xf000-0xffff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: can't assign; no space
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: failed to assign
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: can't assign; no space
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: failed to assign
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: can't assign; no space
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: failed to assign
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x1fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: bridge window [io  0x2000-0x2fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: bridge window [io  0x3000-0x3fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: bridge window [io  0x4000-0x4fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: bridge window [io  0x5000-0x5fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: bridge window [io  0x6000-0x6fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: bridge window [io  0x7000-0x7fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: bridge window [io  0x8000-0x8fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: bridge window [io  0x9000-0x9fff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: bridge window [io  0xa000-0xafff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: bridge window [io  0xb000-0xbfff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: bridge window [io  0xe000-0xefff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: bridge window [io  0xf000-0xffff]: assigned
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: can't assign; no space
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: failed to assign
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: can't assign; no space
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: failed to assign
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: can't assign; no space
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: failed to assign
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4:   bridge window [io  0xf000-0xffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5:   bridge window [io  0xe000-0xefff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6:   bridge window [io  0xb000-0xbfff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7:   bridge window [io  0xa000-0xafff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0:   bridge window [io  0x9000-0x9fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1:   bridge window [io  0x8000-0x8fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2:   bridge window [io  0x7000-0x7fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3:   bridge window [io  0x6000-0x6fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4:   bridge window [io  0x5000-0x5fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5:   bridge window [io  0x4000-0x4fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6:   bridge window [io  0x3000-0x3fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7:   bridge window [io  0x2000-0x2fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0:   bridge window [io  0x1000-0x1fff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Jan 12 13:02:13 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:00: resource 9 [mem 0x280000000-0xa7fffffff window]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:01: resource 0 [io  0xc000-0xcfff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:01: resource 1 [mem 0xfc600000-0xfc9fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:01: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:02: resource 0 [io  0xc000-0xcfff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:02: resource 1 [mem 0xfc600000-0xfc7fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:02: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:03: resource 2 [mem 0xfbe00000-0xfbffffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:04: resource 2 [mem 0xfbc00000-0xfbdfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:05: resource 2 [mem 0xfba00000-0xfbbfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:06: resource 0 [io  0xf000-0xffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:06: resource 2 [mem 0xfb800000-0xfb9fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:07: resource 0 [io  0xe000-0xefff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:07: resource 2 [mem 0xfb600000-0xfb7fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:08: resource 0 [io  0xb000-0xbfff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:08: resource 2 [mem 0xfb400000-0xfb5fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:09: resource 0 [io  0xa000-0xafff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:09: resource 2 [mem 0xfb200000-0xfb3fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0a: resource 0 [io  0x9000-0x9fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0a: resource 1 [mem 0xfda00000-0xfdbfffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0a: resource 2 [mem 0xfb000000-0xfb1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0b: resource 0 [io  0x8000-0x8fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0b: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0b: resource 2 [mem 0xfae00000-0xfaffffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0c: resource 0 [io  0x7000-0x7fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0c: resource 1 [mem 0xfd600000-0xfd7fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0c: resource 2 [mem 0xfac00000-0xfadfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0d: resource 0 [io  0x6000-0x6fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0d: resource 1 [mem 0xfd400000-0xfd5fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0d: resource 2 [mem 0xfaa00000-0xfabfffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0e: resource 0 [io  0x5000-0x5fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0e: resource 1 [mem 0xfd200000-0xfd3fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0e: resource 2 [mem 0xfa800000-0xfa9fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0f: resource 0 [io  0x4000-0x4fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0f: resource 1 [mem 0xfd000000-0xfd1fffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:0f: resource 2 [mem 0xfa600000-0xfa7fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:10: resource 0 [io  0x3000-0x3fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:10: resource 1 [mem 0xfce00000-0xfcffffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:10: resource 2 [mem 0xfa400000-0xfa5fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:11: resource 0 [io  0x2000-0x2fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:11: resource 1 [mem 0xfcc00000-0xfcdfffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:11: resource 2 [mem 0xfa200000-0xfa3fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:12: resource 0 [io  0x1000-0x1fff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:12: resource 1 [mem 0xfca00000-0xfcbfffff]
Jan 12 13:02:13 localhost kernel: pci_bus 0000:12: resource 2 [mem 0xfa000000-0xfa1fffff 64bit pref]
Jan 12 13:02:13 localhost kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 12 13:02:13 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 12 13:02:13 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 12 13:02:13 localhost kernel: software IO TLB: mapped [mem 0x000000006b000000-0x000000006f000000] (64MB)
Jan 12 13:02:13 localhost kernel: ACPI: bus type thunderbolt registered
Jan 12 13:02:13 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 12 13:02:13 localhost kernel: Initialise system trusted keyrings
Jan 12 13:02:13 localhost kernel: Key type blacklist registered
Jan 12 13:02:13 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 12 13:02:13 localhost kernel: zbud: loaded
Jan 12 13:02:13 localhost kernel: integrity: Platform Keyring initialized
Jan 12 13:02:13 localhost kernel: integrity: Machine keyring initialized
Jan 12 13:02:13 localhost kernel: Freeing initrd memory: 87840K
Jan 12 13:02:13 localhost kernel: NET: Registered PF_ALG protocol family
Jan 12 13:02:13 localhost kernel: xor: automatically using best checksumming function   avx
Jan 12 13:02:13 localhost kernel: Key type asymmetric registered
Jan 12 13:02:13 localhost kernel: Asymmetric key parser 'x509' registered
Jan 12 13:02:13 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 12 13:02:13 localhost kernel: io scheduler mq-deadline registered
Jan 12 13:02:13 localhost kernel: io scheduler kyber registered
Jan 12 13:02:13 localhost kernel: io scheduler bfq registered
Jan 12 13:02:13 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 12 13:02:13 localhost kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39
Jan 12 13:02:13 localhost kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40
Jan 12 13:02:13 localhost kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40
Jan 12 13:02:13 localhost kernel: shpchp 0000:01:00.0: HPC vendor_id 1b36 device_id e ss_vid 0 ss_did 0
Jan 12 13:02:13 localhost kernel: shpchp 0000:01:00.0: pci_hp_register failed with error -16
Jan 12 13:02:13 localhost kernel: shpchp 0000:01:00.0: Slot initialization failed
Jan 12 13:02:13 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 12 13:02:13 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 12 13:02:13 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 12 13:02:13 localhost kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Jan 12 13:02:13 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 12 13:02:13 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 12 13:02:13 localhost kernel: Non-volatile memory driver v1.3
Jan 12 13:02:13 localhost kernel: rdac: device handler registered
Jan 12 13:02:13 localhost kernel: hp_sw: device handler registered
Jan 12 13:02:13 localhost kernel: emc: device handler registered
Jan 12 13:02:13 localhost kernel: alua: device handler registered
Jan 12 13:02:13 localhost kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller
Jan 12 13:02:13 localhost kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1
Jan 12 13:02:13 localhost kernel: uhci_hcd 0000:02:01.0: detected 2 ports
Jan 12 13:02:13 localhost kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x0000c000
Jan 12 13:02:13 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 12 13:02:13 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 12 13:02:13 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 12 13:02:13 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-655.el9.x86_64 uhci_hcd
Jan 12 13:02:13 localhost kernel: usb usb1: SerialNumber: 0000:02:01.0
Jan 12 13:02:13 localhost kernel: hub 1-0:1.0: USB hub found
Jan 12 13:02:13 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 12 13:02:13 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 12 13:02:13 localhost kernel: usbserial: USB Serial support registered for generic
Jan 12 13:02:13 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 12 13:02:13 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 12 13:02:13 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 12 13:02:13 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 12 13:02:13 localhost kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 12 13:02:13 localhost kernel: rtc_cmos 00:03: registered as rtc0
Jan 12 13:02:13 localhost kernel: rtc_cmos 00:03: setting system clock to 2026-01-12T13:02:13 UTC (1768222933)
Jan 12 13:02:13 localhost kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 12 13:02:13 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 12 13:02:13 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 12 13:02:13 localhost kernel: usbcore: registered new interface driver usbhid
Jan 12 13:02:13 localhost kernel: usbhid: USB HID core driver
Jan 12 13:02:13 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 12 13:02:13 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 12 13:02:13 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 12 13:02:13 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 12 13:02:13 localhost kernel: Initializing XFRM netlink socket
Jan 12 13:02:13 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 12 13:02:13 localhost kernel: Segment Routing with IPv6
Jan 12 13:02:13 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 12 13:02:13 localhost kernel: mpls_gso: MPLS GSO support
Jan 12 13:02:13 localhost kernel: IPI shorthand broadcast: enabled
Jan 12 13:02:13 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 12 13:02:13 localhost kernel: AES CTR mode by8 optimization enabled
Jan 12 13:02:13 localhost kernel: sched_clock: Marking stable (914001793, 145356509)->(1268347300, -208988998)
Jan 12 13:02:13 localhost kernel: registered taskstats version 1
Jan 12 13:02:13 localhost kernel: Loading compiled-in X.509 certificates
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: cff02aed51f99e4030f8d5c362e1fce40d054fe7'
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 12 13:02:13 localhost kernel: Demotion targets for Node 0: null
Jan 12 13:02:13 localhost kernel: page_owner is disabled
Jan 12 13:02:13 localhost kernel: Key type .fscrypt registered
Jan 12 13:02:13 localhost kernel: Key type fscrypt-provisioning registered
Jan 12 13:02:13 localhost kernel: Key type big_key registered
Jan 12 13:02:13 localhost kernel: Key type encrypted registered
Jan 12 13:02:13 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 12 13:02:13 localhost kernel: Loading compiled-in module X.509 certificates
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: cff02aed51f99e4030f8d5c362e1fce40d054fe7'
Jan 12 13:02:13 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 12 13:02:13 localhost kernel: ima: No architecture policies found
Jan 12 13:02:13 localhost kernel: evm: Initialising EVM extended attributes:
Jan 12 13:02:13 localhost kernel: evm: security.selinux
Jan 12 13:02:13 localhost kernel: evm: security.SMACK64 (disabled)
Jan 12 13:02:13 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 12 13:02:13 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 12 13:02:13 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 12 13:02:13 localhost kernel: evm: security.apparmor (disabled)
Jan 12 13:02:13 localhost kernel: evm: security.ima
Jan 12 13:02:13 localhost kernel: evm: security.capability
Jan 12 13:02:13 localhost kernel: evm: HMAC attrs: 0x1
Jan 12 13:02:13 localhost kernel: Running certificate verification RSA selftest
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 12 13:02:13 localhost kernel: Running certificate verification ECDSA selftest
Jan 12 13:02:13 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 12 13:02:13 localhost kernel: clk: Disabling unused clocks
Jan 12 13:02:13 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 12 13:02:13 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Jan 12 13:02:13 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 12 13:02:13 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Jan 12 13:02:13 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 12 13:02:13 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 12 13:02:13 localhost kernel: Run /init as init process
Jan 12 13:02:13 localhost kernel:   with arguments:
Jan 12 13:02:13 localhost kernel:     /init
Jan 12 13:02:13 localhost kernel:   with environment:
Jan 12 13:02:13 localhost kernel:     HOME=/
Jan 12 13:02:13 localhost kernel:     TERM=linux
Jan 12 13:02:13 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-655.el9.x86_64
Jan 12 13:02:13 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 12 13:02:13 localhost systemd[1]: Detected virtualization kvm.
Jan 12 13:02:13 localhost systemd[1]: Detected architecture x86-64.
Jan 12 13:02:13 localhost systemd[1]: Running in initrd.
Jan 12 13:02:13 localhost systemd[1]: No hostname configured, using default hostname.
Jan 12 13:02:13 localhost systemd[1]: Hostname set to <localhost>.
Jan 12 13:02:13 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 12 13:02:13 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 12 13:02:13 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 12 13:02:13 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 12 13:02:13 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 12 13:02:13 localhost systemd[1]: Reached target Local File Systems.
Jan 12 13:02:13 localhost systemd[1]: Reached target Path Units.
Jan 12 13:02:13 localhost systemd[1]: Reached target Slice Units.
Jan 12 13:02:13 localhost systemd[1]: Reached target Swaps.
Jan 12 13:02:13 localhost systemd[1]: Reached target Timer Units.
Jan 12 13:02:13 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 12 13:02:13 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 12 13:02:13 localhost systemd[1]: Listening on Journal Socket.
Jan 12 13:02:13 localhost systemd[1]: Listening on udev Control Socket.
Jan 12 13:02:13 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 12 13:02:13 localhost systemd[1]: Reached target Socket Units.
Jan 12 13:02:13 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 12 13:02:13 localhost systemd[1]: Starting Journal Service...
Jan 12 13:02:13 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 12 13:02:13 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 12 13:02:13 localhost systemd[1]: Starting Create System Users...
Jan 12 13:02:13 localhost systemd[1]: Starting Setup Virtual Console...
Jan 12 13:02:13 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 12 13:02:13 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 12 13:02:13 localhost systemd[1]: Finished Create System Users.
Jan 12 13:02:13 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 12 13:02:13 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 12 13:02:13 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 12 13:02:13 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 12 13:02:13 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:02.0:00.0:01.0-1
Jan 12 13:02:13 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 12 13:02:13 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0
Jan 12 13:02:13 localhost systemd-journald[281]: Journal started
Jan 12 13:02:13 localhost systemd-journald[281]: Runtime Journal (/run/log/journal/d52817b97ba947d6a0e8c5b94b158f91) is 8.0M, max 153.6M, 145.6M free.
Jan 12 13:02:13 localhost systemd-sysusers[284]: Creating group 'users' with GID 100.
Jan 12 13:02:13 localhost systemd-sysusers[284]: Creating group 'dbus' with GID 81.
Jan 12 13:02:13 localhost systemd-sysusers[284]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 12 13:02:13 localhost systemd[1]: Started Journal Service.
Jan 12 13:02:13 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 12 13:02:13 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 12 13:02:13 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 12 13:02:13 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 12 13:02:14 localhost systemd[1]: Finished Setup Virtual Console.
Jan 12 13:02:14 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 12 13:02:14 localhost systemd[1]: Starting dracut cmdline hook...
Jan 12 13:02:14 localhost dracut-cmdline[297]: dracut-9 dracut-057-102.git20250818.el9
Jan 12 13:02:14 localhost dracut-cmdline[297]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-655.el9.x86_64 root=UUID=f2a0a5c1-133f-4977-b837-e40b31cbd9cc ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 12 13:02:14 localhost systemd[1]: Finished dracut cmdline hook.
Jan 12 13:02:14 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 12 13:02:14 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 12 13:02:14 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 12 13:02:14 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 12 13:02:14 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 12 13:02:14 localhost kernel: RPC: Registered udp transport module.
Jan 12 13:02:14 localhost kernel: RPC: Registered tcp transport module.
Jan 12 13:02:14 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 12 13:02:14 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 12 13:02:14 localhost rpc.statd[413]: Version 2.5.4 starting
Jan 12 13:02:14 localhost rpc.statd[413]: Initializing NSM state
Jan 12 13:02:14 localhost rpc.idmapd[418]: Setting log level to 0
Jan 12 13:02:14 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 12 13:02:14 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 12 13:02:14 localhost systemd-udevd[431]: Using default interface naming scheme 'rhel-9.0'.
Jan 12 13:02:14 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 12 13:02:14 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 12 13:02:14 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 12 13:02:14 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 12 13:02:14 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 12 13:02:14 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 12 13:02:14 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 12 13:02:14 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 12 13:02:14 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 12 13:02:14 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 12 13:02:14 localhost systemd[1]: Reached target Network.
Jan 12 13:02:14 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 12 13:02:14 localhost systemd[1]: Starting dracut initqueue hook...
Jan 12 13:02:14 localhost kernel: virtio_blk virtio2: 4/0/0 default/read/poll queues
Jan 12 13:02:14 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 12 13:02:14 localhost kernel:  vda: vda1
Jan 12 13:02:14 localhost systemd-udevd[460]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:02:14 localhost kernel: libata version 3.00 loaded.
Jan 12 13:02:14 localhost systemd[1]: Found device /dev/disk/by-uuid/f2a0a5c1-133f-4977-b837-e40b31cbd9cc.
Jan 12 13:02:14 localhost kernel: ahci 0000:00:1f.2: version 3.0
Jan 12 13:02:14 localhost kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jan 12 13:02:14 localhost kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Jan 12 13:02:14 localhost kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Jan 12 13:02:14 localhost kernel: ahci 0000:00:1f.2: flags: 64bit ncq only 
Jan 12 13:02:14 localhost kernel: scsi host0: ahci
Jan 12 13:02:14 localhost kernel: scsi host1: ahci
Jan 12 13:02:14 localhost kernel: scsi host2: ahci
Jan 12 13:02:14 localhost kernel: scsi host3: ahci
Jan 12 13:02:14 localhost kernel: scsi host4: ahci
Jan 12 13:02:14 localhost kernel: scsi host5: ahci
Jan 12 13:02:14 localhost kernel: ata1: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22100 irq 49 lpm-pol 0
Jan 12 13:02:14 localhost kernel: ata2: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22180 irq 49 lpm-pol 0
Jan 12 13:02:14 localhost kernel: ata3: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22200 irq 49 lpm-pol 0
Jan 12 13:02:14 localhost kernel: ata4: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22280 irq 49 lpm-pol 0
Jan 12 13:02:14 localhost kernel: ata5: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22300 irq 49 lpm-pol 0
Jan 12 13:02:14 localhost kernel: ata6: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22380 irq 49 lpm-pol 0
Jan 12 13:02:14 localhost systemd[1]: Reached target Initrd Root Device.
Jan 12 13:02:14 localhost kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Jan 12 13:02:14 localhost kernel: ata2: SATA link down (SStatus 0 SControl 300)
Jan 12 13:02:14 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 12 13:02:14 localhost kernel: ata1.00: applying bridge limits
Jan 12 13:02:14 localhost kernel: ata1.00: configured for UDMA/100
Jan 12 13:02:14 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 12 13:02:14 localhost kernel: ata4: SATA link down (SStatus 0 SControl 300)
Jan 12 13:02:14 localhost kernel: ata6: SATA link down (SStatus 0 SControl 300)
Jan 12 13:02:14 localhost kernel: ata3: SATA link down (SStatus 0 SControl 300)
Jan 12 13:02:14 localhost kernel: ata5: SATA link down (SStatus 0 SControl 300)
Jan 12 13:02:14 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 12 13:02:14 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 12 13:02:14 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 12 13:02:14 localhost systemd[1]: Reached target System Initialization.
Jan 12 13:02:14 localhost systemd[1]: Reached target Basic System.
Jan 12 13:02:14 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 12 13:02:14 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 12 13:02:14 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 12 13:02:14 localhost systemd[1]: Finished dracut initqueue hook.
Jan 12 13:02:14 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 12 13:02:14 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 12 13:02:14 localhost systemd[1]: Reached target Remote File Systems.
Jan 12 13:02:14 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 12 13:02:15 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 12 13:02:15 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/f2a0a5c1-133f-4977-b837-e40b31cbd9cc...
Jan 12 13:02:15 localhost systemd-fsck[527]: /usr/sbin/fsck.xfs: XFS file system.
Jan 12 13:02:15 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/f2a0a5c1-133f-4977-b837-e40b31cbd9cc.
Jan 12 13:02:15 localhost systemd[1]: Mounting /sysroot...
Jan 12 13:02:15 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 12 13:02:15 localhost kernel: XFS (vda1): Mounting V5 Filesystem f2a0a5c1-133f-4977-b837-e40b31cbd9cc
Jan 12 13:02:15 localhost kernel: XFS (vda1): Ending clean mount
Jan 12 13:02:15 localhost systemd[1]: Mounted /sysroot.
Jan 12 13:02:15 localhost systemd[1]: Reached target Initrd Root File System.
Jan 12 13:02:15 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 12 13:02:15 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 12 13:02:15 localhost systemd[1]: Reached target Initrd File Systems.
Jan 12 13:02:15 localhost systemd[1]: Reached target Initrd Default Target.
Jan 12 13:02:15 localhost systemd[1]: Starting dracut mount hook...
Jan 12 13:02:15 localhost systemd[1]: Finished dracut mount hook.
Jan 12 13:02:15 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 12 13:02:15 localhost rpc.idmapd[418]: exiting on signal 15
Jan 12 13:02:15 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 12 13:02:15 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 12 13:02:15 localhost systemd[1]: Stopped target Network.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Timer Units.
Jan 12 13:02:15 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 12 13:02:15 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Basic System.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Path Units.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Remote File Systems.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Slice Units.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Socket Units.
Jan 12 13:02:15 localhost systemd[1]: Stopped target System Initialization.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Local File Systems.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Swaps.
Jan 12 13:02:15 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped dracut mount hook.
Jan 12 13:02:15 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 12 13:02:15 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 12 13:02:15 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 12 13:02:15 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 12 13:02:15 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 12 13:02:15 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 12 13:02:15 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 12 13:02:15 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 12 13:02:15 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 12 13:02:15 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 12 13:02:15 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 12 13:02:15 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 12 13:02:15 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Closed udev Control Socket.
Jan 12 13:02:15 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Closed udev Kernel Socket.
Jan 12 13:02:15 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 12 13:02:15 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 12 13:02:15 localhost systemd[1]: Starting Cleanup udev Database...
Jan 12 13:02:15 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 12 13:02:15 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 12 13:02:15 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Stopped Create System Users.
Jan 12 13:02:15 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 12 13:02:15 localhost systemd[1]: Finished Cleanup udev Database.
Jan 12 13:02:15 localhost systemd[1]: Reached target Switch Root.
Jan 12 13:02:15 localhost systemd[1]: Starting Switch Root...
Jan 12 13:02:15 localhost systemd[1]: Switching root.
Jan 12 13:02:15 localhost systemd-journald[281]: Journal stopped
Jan 12 13:02:16 localhost systemd-journald[281]: Received SIGTERM from PID 1 (systemd).
Jan 12 13:02:16 localhost kernel: audit: type=1404 audit(1768222935.692:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 12 13:02:16 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:02:16 localhost kernel: SELinux:  policy capability open_perms=1
Jan 12 13:02:16 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:02:16 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:02:16 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:02:16 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:02:16 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:02:16 localhost kernel: audit: type=1403 audit(1768222935.799:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 12 13:02:16 localhost systemd[1]: Successfully loaded SELinux policy in 109.984ms.
Jan 12 13:02:16 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 21.428ms.
Jan 12 13:02:16 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 12 13:02:16 localhost systemd[1]: Detected virtualization kvm.
Jan 12 13:02:16 localhost systemd[1]: Detected architecture x86-64.
Jan 12 13:02:16 localhost systemd-rc-local-generator[608]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:02:16 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Stopped Switch Root.
Jan 12 13:02:16 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 12 13:02:16 localhost systemd[1]: Created slice Slice /system/getty.
Jan 12 13:02:16 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 12 13:02:16 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 12 13:02:16 localhost systemd[1]: Created slice User and Session Slice.
Jan 12 13:02:16 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 12 13:02:16 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 12 13:02:16 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 12 13:02:16 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 12 13:02:16 localhost systemd[1]: Stopped target Switch Root.
Jan 12 13:02:16 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 12 13:02:16 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 12 13:02:16 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 12 13:02:16 localhost systemd[1]: Reached target Path Units.
Jan 12 13:02:16 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 12 13:02:16 localhost systemd[1]: Reached target Slice Units.
Jan 12 13:02:16 localhost systemd[1]: Reached target Swaps.
Jan 12 13:02:16 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 12 13:02:16 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 12 13:02:16 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 12 13:02:16 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 12 13:02:16 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 12 13:02:16 localhost systemd[1]: Listening on udev Control Socket.
Jan 12 13:02:16 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 12 13:02:16 localhost systemd[1]: Mounting Huge Pages File System...
Jan 12 13:02:16 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 12 13:02:16 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 12 13:02:16 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 12 13:02:16 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 12 13:02:16 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 12 13:02:16 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 12 13:02:16 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 12 13:02:16 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 12 13:02:16 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 12 13:02:16 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 12 13:02:16 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 12 13:02:16 localhost systemd[1]: Stopped Journal Service.
Jan 12 13:02:16 localhost systemd[1]: Starting Journal Service...
Jan 12 13:02:16 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 12 13:02:16 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 12 13:02:16 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 12 13:02:16 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 12 13:02:16 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 12 13:02:16 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 12 13:02:16 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 12 13:02:16 localhost kernel: fuse: init (API version 7.37)
Jan 12 13:02:16 localhost systemd[1]: Mounted Huge Pages File System.
Jan 12 13:02:16 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 12 13:02:16 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 12 13:02:16 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 12 13:02:16 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 12 13:02:16 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 12 13:02:16 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 12 13:02:16 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 12 13:02:16 localhost systemd-journald[649]: Journal started
Jan 12 13:02:16 localhost systemd-journald[649]: Runtime Journal (/run/log/journal/bfa963f84c4f244b9e78b91a43b5e88e) is 8.0M, max 153.6M, 145.6M free.
Jan 12 13:02:16 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 12 13:02:16 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Started Journal Service.
Jan 12 13:02:16 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 12 13:02:16 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 12 13:02:16 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 12 13:02:16 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 12 13:02:16 localhost kernel: ACPI: bus type drm_connector registered
Jan 12 13:02:16 localhost systemd[1]: Mounting FUSE Control File System...
Jan 12 13:02:16 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 12 13:02:16 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 12 13:02:16 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 12 13:02:16 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 12 13:02:16 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 12 13:02:16 localhost systemd-journald[649]: Runtime Journal (/run/log/journal/bfa963f84c4f244b9e78b91a43b5e88e) is 8.0M, max 153.6M, 145.6M free.
Jan 12 13:02:16 localhost systemd-journald[649]: Received client request to flush runtime journal.
Jan 12 13:02:16 localhost systemd[1]: Starting Create System Users...
Jan 12 13:02:16 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 12 13:02:16 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 12 13:02:16 localhost systemd[1]: Mounted FUSE Control File System.
Jan 12 13:02:16 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 12 13:02:16 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 12 13:02:16 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 12 13:02:16 localhost systemd[1]: Finished Create System Users.
Jan 12 13:02:16 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 12 13:02:16 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 12 13:02:16 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 12 13:02:16 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 12 13:02:16 localhost systemd[1]: Reached target Local File Systems.
Jan 12 13:02:16 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 12 13:02:16 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 12 13:02:16 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 12 13:02:16 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 12 13:02:16 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 12 13:02:16 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 12 13:02:16 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 12 13:02:16 localhost bootctl[667]: Couldn't find EFI system partition, skipping.
Jan 12 13:02:16 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 12 13:02:16 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 12 13:02:16 localhost systemd[1]: Starting Security Auditing Service...
Jan 12 13:02:16 localhost systemd[1]: Starting RPC Bind...
Jan 12 13:02:16 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 12 13:02:16 localhost auditd[673]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 12 13:02:16 localhost auditd[673]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 12 13:02:16 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 12 13:02:16 localhost systemd[1]: Started RPC Bind.
Jan 12 13:02:16 localhost augenrules[678]: /sbin/augenrules: No change
Jan 12 13:02:16 localhost augenrules[693]: No rules
Jan 12 13:02:16 localhost augenrules[693]: enabled 1
Jan 12 13:02:16 localhost augenrules[693]: failure 1
Jan 12 13:02:16 localhost augenrules[693]: pid 673
Jan 12 13:02:16 localhost augenrules[693]: rate_limit 0
Jan 12 13:02:16 localhost augenrules[693]: backlog_limit 8192
Jan 12 13:02:16 localhost augenrules[693]: lost 0
Jan 12 13:02:16 localhost augenrules[693]: backlog 4
Jan 12 13:02:16 localhost augenrules[693]: backlog_wait_time 60000
Jan 12 13:02:16 localhost augenrules[693]: backlog_wait_time_actual 0
Jan 12 13:02:16 localhost augenrules[693]: enabled 1
Jan 12 13:02:16 localhost augenrules[693]: failure 1
Jan 12 13:02:16 localhost augenrules[693]: pid 673
Jan 12 13:02:16 localhost augenrules[693]: rate_limit 0
Jan 12 13:02:16 localhost augenrules[693]: backlog_limit 8192
Jan 12 13:02:16 localhost augenrules[693]: lost 0
Jan 12 13:02:16 localhost augenrules[693]: backlog 2
Jan 12 13:02:16 localhost augenrules[693]: backlog_wait_time 60000
Jan 12 13:02:16 localhost augenrules[693]: backlog_wait_time_actual 0
Jan 12 13:02:16 localhost augenrules[693]: enabled 1
Jan 12 13:02:16 localhost augenrules[693]: failure 1
Jan 12 13:02:16 localhost augenrules[693]: pid 673
Jan 12 13:02:16 localhost augenrules[693]: rate_limit 0
Jan 12 13:02:16 localhost augenrules[693]: backlog_limit 8192
Jan 12 13:02:16 localhost augenrules[693]: lost 0
Jan 12 13:02:16 localhost augenrules[693]: backlog 1
Jan 12 13:02:16 localhost augenrules[693]: backlog_wait_time 60000
Jan 12 13:02:16 localhost augenrules[693]: backlog_wait_time_actual 0
Jan 12 13:02:16 localhost systemd[1]: Started Security Auditing Service.
Jan 12 13:02:16 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 12 13:02:16 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 12 13:02:16 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 12 13:02:16 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 12 13:02:16 localhost systemd-udevd[701]: Using default interface naming scheme 'rhel-9.0'.
Jan 12 13:02:16 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 12 13:02:16 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 12 13:02:16 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 12 13:02:16 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 12 13:02:16 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 12 13:02:16 localhost systemd-udevd[703]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:02:16 localhost kernel: lpc_ich 0000:00:1f.0: I/O space for GPIO uninitialized
Jan 12 13:02:16 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 12 13:02:16 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 12 13:02:16 localhost systemd[1]: Starting Update is Completed...
Jan 12 13:02:16 localhost systemd[1]: Finished Update is Completed.
Jan 12 13:02:16 localhost kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Jan 12 13:02:16 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 12 13:02:16 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 12 13:02:16 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Jan 12 13:02:16 localhost kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Jan 12 13:02:16 localhost kernel: iTCO_vendor_support: vendor-support=0
Jan 12 13:02:16 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: Found a ICH9 TCO device (Version=2, TCOBASE=0x0660)
Jan 12 13:02:16 localhost kernel: Console: switching to colour dummy device 80x25
Jan 12 13:02:16 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 12 13:02:16 localhost kernel: [drm] features: -context_init
Jan 12 13:02:16 localhost kernel: [drm] number of scanouts: 1
Jan 12 13:02:16 localhost kernel: [drm] number of cap sets: 0
Jan 12 13:02:16 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Jan 12 13:02:16 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 12 13:02:16 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: initialized. heartbeat=30 sec (nowayout=0)
Jan 12 13:02:16 localhost kernel: Console: switching to colour frame buffer device 160x50
Jan 12 13:02:16 localhost kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 12 13:02:17 localhost kernel: kvm_amd: TSC scaling supported
Jan 12 13:02:17 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 12 13:02:17 localhost kernel: kvm_amd: Nested Paging enabled
Jan 12 13:02:17 localhost kernel: kvm_amd: LBR virtualization supported
Jan 12 13:02:17 localhost kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Jan 12 13:02:17 localhost kernel: kvm_amd: Virtual GIF supported
Jan 12 13:02:17 localhost systemd[1]: Reached target System Initialization.
Jan 12 13:02:17 localhost systemd[1]: Started dnf makecache --timer.
Jan 12 13:02:17 localhost systemd[1]: Started Daily rotation of log files.
Jan 12 13:02:17 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 12 13:02:17 localhost systemd[1]: Reached target Timer Units.
Jan 12 13:02:17 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 12 13:02:17 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 12 13:02:17 localhost systemd[1]: Reached target Socket Units.
Jan 12 13:02:17 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 12 13:02:17 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 12 13:02:17 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 12 13:02:17 localhost systemd[1]: Reached target Basic System.
Jan 12 13:02:17 localhost dbus-broker-lau[765]: Ready
Jan 12 13:02:17 localhost systemd[1]: Starting NTP client/server...
Jan 12 13:02:17 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 12 13:02:17 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 12 13:02:17 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 12 13:02:17 localhost systemd[1]: Started irqbalance daemon.
Jan 12 13:02:17 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 12 13:02:17 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 12 13:02:17 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 12 13:02:17 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 12 13:02:17 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 12 13:02:17 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 12 13:02:17 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 12 13:02:17 localhost systemd[1]: Starting User Login Management...
Jan 12 13:02:17 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 12 13:02:17 localhost chronyd[783]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 12 13:02:17 localhost chronyd[783]: Loaded 0 symmetric keys
Jan 12 13:02:17 localhost chronyd[783]: Using right/UTC timezone to obtain leap second data
Jan 12 13:02:17 localhost chronyd[783]: Loaded seccomp filter (level 2)
Jan 12 13:02:17 localhost systemd[1]: Started NTP client/server.
Jan 12 13:02:17 localhost systemd-logind[775]: New seat seat0.
Jan 12 13:02:17 localhost systemd-logind[775]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 12 13:02:17 localhost systemd-logind[775]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 12 13:02:17 localhost systemd[1]: Started User Login Management.
Jan 12 13:02:17 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 12 13:02:17 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 12 13:02:17 localhost iptables.init[770]: iptables: Applying firewall rules: [  OK  ]
Jan 12 13:02:17 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 12 13:02:17 localhost cloud-init[793]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 12 Jan 2026 13:02:17 +0000. Up 5.05 seconds.
Jan 12 13:02:17 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 12 13:02:17 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 12 13:02:17 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpj5qwjttn.mount: Deactivated successfully.
Jan 12 13:02:17 localhost systemd[1]: Starting Hostname Service...
Jan 12 13:02:17 localhost systemd[1]: Started Hostname Service.
Jan 12 13:02:17 np0005581840 systemd-hostnamed[807]: Hostname set to <np0005581840> (static)
Jan 12 13:02:17 np0005581840 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 12 13:02:17 np0005581840 systemd[1]: Reached target Preparation for Network.
Jan 12 13:02:18 np0005581840 systemd[1]: Starting Network Manager...
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0344] NetworkManager (version 1.54.2-1.el9) is starting... (boot:94c17a7f-65c5-449e-af1b-e34d5bf3c7ea)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0347] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0432] manager[0x5566a0d61000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0457] hostname: hostname: using hostnamed
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0458] hostname: static hostname changed from (none) to "np0005581840"
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0460] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0519] manager[0x5566a0d61000]: rfkill: Wi-Fi hardware radio set enabled
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0519] manager[0x5566a0d61000]: rfkill: WWAN hardware radio set enabled
Jan 12 13:02:18 np0005581840 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0571] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0571] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0571] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0571] manager: Networking is enabled by state file
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0572] settings: Loaded settings plugin: keyfile (internal)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0586] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0605] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0618] dhcp: init: Using DHCP client 'internal'
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0620] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0631] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0641] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0648] device (lo): Activation: starting connection 'lo' (32e0de59-2ecd-409f-9e28-312d4eb0815d)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0655] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0659] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0679] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0687] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0690] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0692] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0697] device (eth0): carrier: link connected
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0700] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0705] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 12 13:02:18 np0005581840 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0710] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0713] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0714] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0718] manager: NetworkManager state is now CONNECTING
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0720] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:02:18 np0005581840 systemd[1]: Started Network Manager.
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0724] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0730] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:02:18 np0005581840 systemd[1]: Reached target Network.
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0735] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 12 13:02:18 np0005581840 systemd[1]: Starting Network Manager Wait Online...
Jan 12 13:02:18 np0005581840 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0787] dhcp4 (eth0): state changed new lease, address=192.168.25.114
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0793] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 12 13:02:18 np0005581840 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0842] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0845] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 12 13:02:18 np0005581840 NetworkManager[811]: <info>  [1768222938.0851] device (lo): Activation: successful, device activated.
Jan 12 13:02:18 np0005581840 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 12 13:02:18 np0005581840 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 12 13:02:18 np0005581840 systemd[1]: Reached target NFS client services.
Jan 12 13:02:18 np0005581840 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 12 13:02:18 np0005581840 systemd[1]: Reached target Remote File Systems.
Jan 12 13:02:18 np0005581840 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 12 13:02:19 np0005581840 NetworkManager[811]: <info>  [1768222939.6597] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:02:20 np0005581840 NetworkManager[811]: <info>  [1768222940.7446] dhcp6 (eth0): state changed new lease, address=2001:db8::10a
Jan 12 13:02:21 np0005581840 NetworkManager[811]: <info>  [1768222941.9644] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:02:21 np0005581840 NetworkManager[811]: <info>  [1768222941.9682] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:02:21 np0005581840 NetworkManager[811]: <info>  [1768222941.9684] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:02:21 np0005581840 NetworkManager[811]: <info>  [1768222941.9690] manager: NetworkManager state is now CONNECTED_SITE
Jan 12 13:02:21 np0005581840 NetworkManager[811]: <info>  [1768222941.9695] device (eth0): Activation: successful, device activated.
Jan 12 13:02:21 np0005581840 NetworkManager[811]: <info>  [1768222941.9701] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 12 13:02:21 np0005581840 NetworkManager[811]: <info>  [1768222941.9705] manager: startup complete
Jan 12 13:02:21 np0005581840 systemd[1]: Finished Network Manager Wait Online.
Jan 12 13:02:21 np0005581840 systemd[1]: Starting Cloud-init: Network Stage...
Jan 12 13:02:22 np0005581840 cloud-init[878]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 12 Jan 2026 13:02:22 +0000. Up 9.62 seconds.
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |  eth0  | True |        192.168.25.114        | 255.255.255.0 | global | fa:16:3e:e0:39:44 |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |  eth0  | True |      2001:db8::10a/128       |       .       | global | fa:16:3e:e0:39:44 |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |  eth0  | True | fe80::f816:3eff:fee0:3944/64 |       .       |  link  | fa:16:3e:e0:39:44 |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: | Route |   Destination   |   Gateway    |     Genmask     | Interface | Flags |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   0   |     0.0.0.0     | 192.168.25.1 |     0.0.0.0     |    eth0   |   UG  |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   1   | 169.254.169.254 | 192.168.25.2 | 255.255.255.255 |    eth0   |  UGH  |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   2   |   192.168.25.0  |   0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: ++++++++++++++++++++++Route IPv6 info++++++++++++++++++++++
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: | Route |  Destination  |   Gateway   | Interface | Flags |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   1   |  2001:db8::1  |      ::     |    eth0   |   U   |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   2   | 2001:db8::10a |      ::     |    eth0   |   U   |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   3   |   fe80::/64   |      ::     |    eth0   |   U   |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   4   |      ::/0     | 2001:db8::1 |    eth0   |   UG  |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   6   |     local     |      ::     |    eth0   |   U   |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   7   |     local     |      ::     |    eth0   |   U   |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: |   8   |   multicast   |      ::     |    eth0   |   U   |
Jan 12 13:02:22 np0005581840 cloud-init[878]: ci-info: +-------+---------------+-------------+-----------+-------+
Jan 12 13:02:22 np0005581840 useradd[945]: new group: name=cloud-user, GID=1001
Jan 12 13:02:22 np0005581840 useradd[945]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 12 13:02:22 np0005581840 useradd[945]: add 'cloud-user' to group 'adm'
Jan 12 13:02:22 np0005581840 useradd[945]: add 'cloud-user' to group 'systemd-journal'
Jan 12 13:02:22 np0005581840 useradd[945]: add 'cloud-user' to shadow group 'adm'
Jan 12 13:02:22 np0005581840 useradd[945]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 12 13:02:23 np0005581840 chronyd[783]: Selected source 99.28.14.242 (2.centos.pool.ntp.org)
Jan 12 13:02:23 np0005581840 chronyd[783]: System clock wrong by 1.322235 seconds
Jan 12 13:02:24 np0005581840 chronyd[783]: System clock was stepped by 1.322235 seconds
Jan 12 13:02:24 np0005581840 chronyd[783]: System clock TAI offset set to 37 seconds
Jan 12 13:02:24 np0005581840 cloud-init[878]: Generating public/private rsa key pair.
Jan 12 13:02:24 np0005581840 cloud-init[878]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 12 13:02:24 np0005581840 cloud-init[878]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 12 13:02:24 np0005581840 cloud-init[878]: The key fingerprint is:
Jan 12 13:02:24 np0005581840 cloud-init[878]: SHA256:tgdyQHrrArW36wqCp9n2hujik6gKobna/xzNfc0lXAM root@np0005581840
Jan 12 13:02:24 np0005581840 cloud-init[878]: The key's randomart image is:
Jan 12 13:02:24 np0005581840 cloud-init[878]: +---[RSA 3072]----+
Jan 12 13:02:24 np0005581840 cloud-init[878]: |      .      E   |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |     o        .  |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |    o o        ..|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |   . o o     . ..|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |. . . + S     o .|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |oo . o B +   o o |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |B.+.. + + o . o  |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |=@o..o o . .     |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |#+++++=          |
Jan 12 13:02:24 np0005581840 cloud-init[878]: +----[SHA256]-----+
Jan 12 13:02:24 np0005581840 cloud-init[878]: Generating public/private ecdsa key pair.
Jan 12 13:02:24 np0005581840 cloud-init[878]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 12 13:02:24 np0005581840 cloud-init[878]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 12 13:02:24 np0005581840 cloud-init[878]: The key fingerprint is:
Jan 12 13:02:24 np0005581840 cloud-init[878]: SHA256:n2Ouo9rDsM65uCvn1wMNsPIoUExmfrxXpFgFimtksZo root@np0005581840
Jan 12 13:02:24 np0005581840 cloud-init[878]: The key's randomart image is:
Jan 12 13:02:24 np0005581840 cloud-init[878]: +---[ECDSA 256]---+
Jan 12 13:02:24 np0005581840 cloud-init[878]: | o=   ooo        |
Jan 12 13:02:24 np0005581840 cloud-init[878]: | ++= + o         |
Jan 12 13:02:24 np0005581840 cloud-init[878]: | .*o= . .        |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |o=.o.. .         |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |E+o .o. S        |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |o.. o..  . .     |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |.    *    =      |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |. .ooo= .o .     |
Jan 12 13:02:24 np0005581840 cloud-init[878]: | +==*oo+.o.      |
Jan 12 13:02:24 np0005581840 cloud-init[878]: +----[SHA256]-----+
Jan 12 13:02:24 np0005581840 cloud-init[878]: Generating public/private ed25519 key pair.
Jan 12 13:02:24 np0005581840 cloud-init[878]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 12 13:02:24 np0005581840 cloud-init[878]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 12 13:02:24 np0005581840 cloud-init[878]: The key fingerprint is:
Jan 12 13:02:24 np0005581840 cloud-init[878]: SHA256:3nYdBj8cn9/vujiVu9aN/WReK+U5Fe41ljxANVjAg9g root@np0005581840
Jan 12 13:02:24 np0005581840 cloud-init[878]: The key's randomart image is:
Jan 12 13:02:24 np0005581840 cloud-init[878]: +--[ED25519 256]--+
Jan 12 13:02:24 np0005581840 cloud-init[878]: |          o o.=+ |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |         . E =  .|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |            o o  |
Jan 12 13:02:24 np0005581840 cloud-init[878]: |             = +.|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |        S     Xo+|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |       . .   oo@=|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |        . o ..*=&|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |         . ..o+OB|
Jan 12 13:02:24 np0005581840 cloud-init[878]: |            .o=*B|
Jan 12 13:02:24 np0005581840 cloud-init[878]: +----[SHA256]-----+
Jan 12 13:02:24 np0005581840 systemd[1]: Finished Cloud-init: Network Stage.
Jan 12 13:02:24 np0005581840 systemd[1]: Reached target Cloud-config availability.
Jan 12 13:02:24 np0005581840 systemd[1]: Reached target Network is Online.
Jan 12 13:02:24 np0005581840 systemd[1]: Starting Cloud-init: Config Stage...
Jan 12 13:02:24 np0005581840 systemd[1]: Starting Crash recovery kernel arming...
Jan 12 13:02:24 np0005581840 systemd[1]: Starting Notify NFS peers of a restart...
Jan 12 13:02:24 np0005581840 systemd[1]: Starting System Logging Service...
Jan 12 13:02:24 np0005581840 sm-notify[961]: Version 2.5.4 starting
Jan 12 13:02:24 np0005581840 systemd[1]: Starting OpenSSH server daemon...
Jan 12 13:02:24 np0005581840 systemd[1]: Starting Permit User Sessions...
Jan 12 13:02:24 np0005581840 sshd[963]: Server listening on 0.0.0.0 port 22.
Jan 12 13:02:24 np0005581840 sshd[963]: Server listening on :: port 22.
Jan 12 13:02:24 np0005581840 systemd[1]: Started OpenSSH server daemon.
Jan 12 13:02:24 np0005581840 systemd[1]: Started Notify NFS peers of a restart.
Jan 12 13:02:24 np0005581840 rsyslogd[962]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="962" x-info="https://www.rsyslog.com"] start
Jan 12 13:02:24 np0005581840 systemd[1]: Started System Logging Service.
Jan 12 13:02:24 np0005581840 rsyslogd[962]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 12 13:02:24 np0005581840 systemd[1]: Finished Permit User Sessions.
Jan 12 13:02:24 np0005581840 systemd[1]: Started Command Scheduler.
Jan 12 13:02:24 np0005581840 sshd-session[972]: Unable to negotiate with 192.168.25.11 port 38082: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 12 13:02:24 np0005581840 systemd[1]: Started Getty on tty1.
Jan 12 13:02:24 np0005581840 crond[973]: (CRON) STARTUP (1.5.7)
Jan 12 13:02:24 np0005581840 crond[973]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 12 13:02:24 np0005581840 crond[973]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 92% if used.)
Jan 12 13:02:24 np0005581840 crond[973]: (CRON) INFO (running with inotify support)
Jan 12 13:02:24 np0005581840 systemd[1]: Started Serial Getty on ttyS0.
Jan 12 13:02:24 np0005581840 systemd[1]: Reached target Login Prompts.
Jan 12 13:02:24 np0005581840 systemd[1]: Reached target Multi-User System.
Jan 12 13:02:24 np0005581840 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 12 13:02:24 np0005581840 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 12 13:02:24 np0005581840 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 12 13:02:24 np0005581840 sshd-session[986]: Unable to negotiate with 192.168.25.11 port 38094: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 12 13:02:24 np0005581840 sshd-session[992]: Unable to negotiate with 192.168.25.11 port 38106: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 12 13:02:24 np0005581840 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 12 13:02:24 np0005581840 sshd-session[1014]: Unable to negotiate with 192.168.25.11 port 38132: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 12 13:02:24 np0005581840 sshd-session[965]: Connection closed by 192.168.25.11 port 38070 [preauth]
Jan 12 13:02:24 np0005581840 sshd-session[1020]: Unable to negotiate with 192.168.25.11 port 38138: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 12 13:02:24 np0005581840 sshd-session[979]: Connection closed by 192.168.25.11 port 38090 [preauth]
Jan 12 13:02:24 np0005581840 kdumpctl[977]: kdump: No kdump initial ramdisk found.
Jan 12 13:02:24 np0005581840 kdumpctl[977]: kdump: Rebuilding /boot/initramfs-5.14.0-655.el9.x86_64kdump.img
Jan 12 13:02:24 np0005581840 sshd-session[995]: Connection closed by 192.168.25.11 port 38122 [preauth]
Jan 12 13:02:24 np0005581840 sshd-session[1004]: Connection closed by 192.168.25.11 port 38124 [preauth]
Jan 12 13:02:24 np0005581840 cloud-init[1107]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 12 Jan 2026 13:02:24 +0000. Up 10.83 seconds.
Jan 12 13:02:24 np0005581840 systemd[1]: Finished Cloud-init: Config Stage.
Jan 12 13:02:24 np0005581840 systemd[1]: Starting Cloud-init: Final Stage...
Jan 12 13:02:25 np0005581840 dracut[1240]: dracut-057-102.git20250818.el9
Jan 12 13:02:25 np0005581840 cloud-init[1258]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 12 Jan 2026 13:02:25 +0000. Up 11.15 seconds.
Jan 12 13:02:25 np0005581840 cloud-init[1261]: #############################################################
Jan 12 13:02:25 np0005581840 cloud-init[1263]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 12 13:02:25 np0005581840 cloud-init[1269]: 256 SHA256:n2Ouo9rDsM65uCvn1wMNsPIoUExmfrxXpFgFimtksZo root@np0005581840 (ECDSA)
Jan 12 13:02:25 np0005581840 cloud-init[1274]: 256 SHA256:3nYdBj8cn9/vujiVu9aN/WReK+U5Fe41ljxANVjAg9g root@np0005581840 (ED25519)
Jan 12 13:02:25 np0005581840 dracut[1242]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/f2a0a5c1-133f-4977-b837-e40b31cbd9cc /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-655.el9.x86_64kdump.img 5.14.0-655.el9.x86_64
Jan 12 13:02:25 np0005581840 cloud-init[1281]: 3072 SHA256:tgdyQHrrArW36wqCp9n2hujik6gKobna/xzNfc0lXAM root@np0005581840 (RSA)
Jan 12 13:02:25 np0005581840 cloud-init[1284]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 12 13:02:25 np0005581840 cloud-init[1285]: #############################################################
Jan 12 13:02:25 np0005581840 cloud-init[1258]: Cloud-init v. 24.4-8.el9 finished at Mon, 12 Jan 2026 13:02:25 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.28 seconds
Jan 12 13:02:25 np0005581840 systemd[1]: Finished Cloud-init: Final Stage.
Jan 12 13:02:25 np0005581840 systemd[1]: Reached target Cloud-init target.
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 12 13:02:25 np0005581840 dracut[1242]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 12 13:02:25 np0005581840 dracut[1242]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 12 13:02:25 np0005581840 dracut[1242]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 12 13:02:25 np0005581840 dracut[1242]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: memstrack is not available
Jan 12 13:02:26 np0005581840 dracut[1242]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 12 13:02:26 np0005581840 dracut[1242]: memstrack is not available
Jan 12 13:02:26 np0005581840 dracut[1242]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 12 13:02:26 np0005581840 dracut[1242]: *** Including module: systemd ***
Jan 12 13:02:26 np0005581840 dracut[1242]: *** Including module: fips ***
Jan 12 13:02:26 np0005581840 dracut[1242]: *** Including module: systemd-initrd ***
Jan 12 13:02:26 np0005581840 dracut[1242]: *** Including module: i18n ***
Jan 12 13:02:26 np0005581840 dracut[1242]: *** Including module: drm ***
Jan 12 13:02:27 np0005581840 dracut[1242]: *** Including module: prefixdevname ***
Jan 12 13:02:27 np0005581840 dracut[1242]: *** Including module: kernel-modules ***
Jan 12 13:02:27 np0005581840 kernel: block vda: the capability attribute has been deprecated.
Jan 12 13:02:27 np0005581840 dracut[1242]: *** Including module: kernel-modules-extra ***
Jan 12 13:02:27 np0005581840 dracut[1242]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 12 13:02:27 np0005581840 dracut[1242]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 12 13:02:27 np0005581840 dracut[1242]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 12 13:02:27 np0005581840 dracut[1242]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 12 13:02:27 np0005581840 dracut[1242]: *** Including module: qemu ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: fstab-sys ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: rootfs-block ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: terminfo ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: udev-rules ***
Jan 12 13:02:28 np0005581840 dracut[1242]: Skipping udev rule: 91-permissions.rules
Jan 12 13:02:28 np0005581840 dracut[1242]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: virtiofs ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: dracut-systemd ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: usrmount ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: base ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: fs-lib ***
Jan 12 13:02:28 np0005581840 dracut[1242]: *** Including module: kdumpbase ***
Jan 12 13:02:28 np0005581840 irqbalance[771]: Cannot change IRQ 45 affinity: Operation not permitted
Jan 12 13:02:28 np0005581840 irqbalance[771]: IRQ 45 affinity is now unmanaged
Jan 12 13:02:28 np0005581840 irqbalance[771]: Cannot change IRQ 44 affinity: Operation not permitted
Jan 12 13:02:28 np0005581840 irqbalance[771]: IRQ 44 affinity is now unmanaged
Jan 12 13:02:28 np0005581840 irqbalance[771]: Cannot change IRQ 42 affinity: Operation not permitted
Jan 12 13:02:28 np0005581840 irqbalance[771]: IRQ 42 affinity is now unmanaged
Jan 12 13:02:29 np0005581840 dracut[1242]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 12 13:02:29 np0005581840 dracut[1242]:   microcode_ctl module: mangling fw_dir
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 12 13:02:29 np0005581840 dracut[1242]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 12 13:02:29 np0005581840 dracut[1242]: *** Including module: openssl ***
Jan 12 13:02:29 np0005581840 dracut[1242]: *** Including module: shutdown ***
Jan 12 13:02:29 np0005581840 dracut[1242]: *** Including module: squash ***
Jan 12 13:02:29 np0005581840 dracut[1242]: *** Including modules done ***
Jan 12 13:02:29 np0005581840 dracut[1242]: *** Installing kernel module dependencies ***
Jan 12 13:02:30 np0005581840 dracut[1242]: *** Installing kernel module dependencies done ***
Jan 12 13:02:30 np0005581840 dracut[1242]: *** Resolving executable dependencies ***
Jan 12 13:02:31 np0005581840 dracut[1242]: *** Resolving executable dependencies done ***
Jan 12 13:02:31 np0005581840 dracut[1242]: *** Generating early-microcode cpio image ***
Jan 12 13:02:31 np0005581840 dracut[1242]: *** Store current command line parameters ***
Jan 12 13:02:31 np0005581840 dracut[1242]: Stored kernel commandline:
Jan 12 13:02:31 np0005581840 dracut[1242]: No dracut internal kernel commandline stored in the initramfs
Jan 12 13:02:31 np0005581840 dracut[1242]: *** Install squash loader ***
Jan 12 13:02:32 np0005581840 dracut[1242]: *** Squashing the files inside the initramfs ***
Jan 12 13:02:33 np0005581840 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 12 13:02:33 np0005581840 dracut[1242]: *** Squashing the files inside the initramfs done ***
Jan 12 13:02:33 np0005581840 dracut[1242]: *** Creating image file '/boot/initramfs-5.14.0-655.el9.x86_64kdump.img' ***
Jan 12 13:02:33 np0005581840 dracut[1242]: *** Hardlinking files ***
Jan 12 13:02:33 np0005581840 dracut[1242]: Mode:           real
Jan 12 13:02:33 np0005581840 dracut[1242]: Files:          50
Jan 12 13:02:33 np0005581840 dracut[1242]: Linked:         0 files
Jan 12 13:02:33 np0005581840 dracut[1242]: Compared:       0 xattrs
Jan 12 13:02:33 np0005581840 dracut[1242]: Compared:       0 files
Jan 12 13:02:33 np0005581840 dracut[1242]: Saved:          0 B
Jan 12 13:02:33 np0005581840 dracut[1242]: Duration:       0.000437 seconds
Jan 12 13:02:33 np0005581840 dracut[1242]: *** Hardlinking files done ***
Jan 12 13:02:33 np0005581840 dracut[1242]: *** Creating initramfs image file '/boot/initramfs-5.14.0-655.el9.x86_64kdump.img' done ***
Jan 12 13:02:34 np0005581840 kdumpctl[977]: kdump: kexec: loaded kdump kernel
Jan 12 13:02:34 np0005581840 kdumpctl[977]: kdump: Starting kdump: [OK]
Jan 12 13:02:34 np0005581840 systemd[1]: Finished Crash recovery kernel arming.
Jan 12 13:02:34 np0005581840 systemd[1]: Startup finished in 1.143s (kernel) + 1.933s (initrd) + 17.088s (userspace) = 20.164s.
Jan 12 13:02:49 np0005581840 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 12 13:03:51 np0005581840 sshd-session[4369]: Accepted publickey for zuul from 192.168.25.12 port 57328 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 12 13:03:51 np0005581840 systemd-logind[775]: New session 1 of user zuul.
Jan 12 13:03:51 np0005581840 systemd[1]: Created slice User Slice of UID 1000.
Jan 12 13:03:51 np0005581840 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 12 13:03:51 np0005581840 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 12 13:03:51 np0005581840 systemd[1]: Starting User Manager for UID 1000...
Jan 12 13:03:51 np0005581840 systemd[4373]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:03:52 np0005581840 systemd[4373]: Queued start job for default target Main User Target.
Jan 12 13:03:52 np0005581840 systemd[4373]: Created slice User Application Slice.
Jan 12 13:03:52 np0005581840 systemd[4373]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 12 13:03:52 np0005581840 systemd[4373]: Started Daily Cleanup of User's Temporary Directories.
Jan 12 13:03:52 np0005581840 systemd[4373]: Reached target Paths.
Jan 12 13:03:52 np0005581840 systemd[4373]: Reached target Timers.
Jan 12 13:03:52 np0005581840 systemd[4373]: Starting D-Bus User Message Bus Socket...
Jan 12 13:03:52 np0005581840 systemd[4373]: Starting Create User's Volatile Files and Directories...
Jan 12 13:03:52 np0005581840 systemd[4373]: Listening on D-Bus User Message Bus Socket.
Jan 12 13:03:52 np0005581840 systemd[4373]: Reached target Sockets.
Jan 12 13:03:52 np0005581840 systemd[4373]: Finished Create User's Volatile Files and Directories.
Jan 12 13:03:52 np0005581840 systemd[4373]: Reached target Basic System.
Jan 12 13:03:52 np0005581840 systemd[4373]: Reached target Main User Target.
Jan 12 13:03:52 np0005581840 systemd[4373]: Startup finished in 93ms.
Jan 12 13:03:52 np0005581840 systemd[1]: Started User Manager for UID 1000.
Jan 12 13:03:52 np0005581840 systemd[1]: Started Session 1 of User zuul.
Jan 12 13:03:52 np0005581840 sshd-session[4369]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:03:52 np0005581840 python3[4455]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:03:54 np0005581840 python3[4483]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:03:58 np0005581840 python3[4537]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:03:59 np0005581840 python3[4577]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 12 13:04:00 np0005581840 python3[4603]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDfVoaKsL7MFdVws47GWi5YkrAqMWzOkl63YjnSHmEw5ZpdAOdOwUrNQr9vkPijfrB2NZzEJ0MMtpP06bEFE8f/p3GpdmIPnouz0cdIoC6JrfmtBVtUKOuq+BCVmemKLxjAxzcLyYelRYzn/oPg/nms4/A8KpidLQCkVyobUEseROvHmtJ2GngBCMXRKt7QX2wJEfvZjlMHGqbrxBkxPu/f1aa1wjFRYrO2r8rVeiDe0JiwbqXAYZ4oHtSVaJQk9wgLAXzWM9W0IuRn9OUNjWtdk3UhBy4CHrUOOGBJ3UXVJ3Ug6xuGPYqZOai8SYha/f0+YNsfvvw9cHxmzlWcsjuHWPxIzkgLSIZP42rAT7Fi8EC40WGHrHc0PBVsLQBKZerUT0oIs84+YAHRn4G2bnn2HctBu8k9dVdwRimLK0oeZ5GMC3WyXqw4YYQr8X+wGytiNgf7NPSAYFdqeglEzEGoeCxhMDeY1tizuQaFD6MNfQxyeAh3GZ4KVGl14P2+Jd8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:01 np0005581840 python3[4627]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:01 np0005581840 python3[4726]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:04:01 np0005581840 python3[4797]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768223041.3510582-207-231688722181499/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=edf636d30a9e4bc9984e33c60dbaec96_id_rsa follow=False checksum=9b63ca4537d42ba254dacecfdd25667fd7e7f836 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:02 np0005581840 python3[4920]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:04:02 np0005581840 python3[4991]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768223041.943604-240-18975549402430/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=edf636d30a9e4bc9984e33c60dbaec96_id_rsa.pub follow=False checksum=435f877adfe1219c01134c49f2f4b47750223fd9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:03 np0005581840 python3[5039]: ansible-ping Invoked with data=pong
Jan 12 13:04:04 np0005581840 python3[5063]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:04:05 np0005581840 python3[5117]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 12 13:04:06 np0005581840 python3[5149]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:06 np0005581840 python3[5173]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:06 np0005581840 python3[5197]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:06 np0005581840 python3[5221]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:07 np0005581840 python3[5245]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:07 np0005581840 python3[5269]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:08 np0005581840 sudo[5293]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noxfudnevdtmqkzlktahxrasgwtldzgo ; /usr/bin/python3'
Jan 12 13:04:08 np0005581840 sudo[5293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:08 np0005581840 python3[5295]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:08 np0005581840 sudo[5293]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:08 np0005581840 sudo[5371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xedmqdgyjneblxljtkydphjzuphehzol ; /usr/bin/python3'
Jan 12 13:04:08 np0005581840 sudo[5371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:08 np0005581840 python3[5373]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:04:08 np0005581840 sudo[5371]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:09 np0005581840 sudo[5444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghmpsibxnjfvtixnhkdjsawlyltqfgmi ; /usr/bin/python3'
Jan 12 13:04:09 np0005581840 sudo[5444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:09 np0005581840 python3[5446]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1768223048.5591471-21-31788287913210/source follow=False _original_basename=mirror_info.sh.j2 checksum=8d04605e615eb785450b583fc5efd2437794600d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:09 np0005581840 sudo[5444]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:09 np0005581840 python3[5494]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:09 np0005581840 python3[5518]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:09 np0005581840 python3[5542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:10 np0005581840 python3[5566]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:10 np0005581840 python3[5590]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:10 np0005581840 python3[5614]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:10 np0005581840 python3[5638]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:11 np0005581840 python3[5662]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:11 np0005581840 python3[5686]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:11 np0005581840 python3[5710]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:11 np0005581840 python3[5734]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:11 np0005581840 python3[5758]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:11 np0005581840 python3[5782]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:12 np0005581840 python3[5806]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:12 np0005581840 python3[5830]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:12 np0005581840 python3[5854]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:12 np0005581840 python3[5878]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:12 np0005581840 python3[5902]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:13 np0005581840 python3[5926]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:13 np0005581840 python3[5950]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:13 np0005581840 python3[5974]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:13 np0005581840 python3[5998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:13 np0005581840 python3[6022]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:14 np0005581840 python3[6046]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:14 np0005581840 python3[6070]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:14 np0005581840 python3[6094]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:04:16 np0005581840 sudo[6118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrittywiuyrwbpfuaxsmgyxzahajcybq ; /usr/bin/python3'
Jan 12 13:04:16 np0005581840 sudo[6118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:16 np0005581840 python3[6120]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 12 13:04:16 np0005581840 systemd[1]: Starting Time & Date Service...
Jan 12 13:04:16 np0005581840 systemd[1]: Started Time & Date Service.
Jan 12 13:04:16 np0005581840 systemd-timedated[6122]: Changed time zone to 'UTC' (UTC).
Jan 12 13:04:17 np0005581840 sudo[6118]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:17 np0005581840 sudo[6149]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxhlhygbwccrnytopzmimkirvmpgesex ; /usr/bin/python3'
Jan 12 13:04:17 np0005581840 sudo[6149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:17 np0005581840 python3[6151]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:17 np0005581840 sudo[6149]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:17 np0005581840 python3[6227]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:04:17 np0005581840 python3[6298]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1768223057.3690553-153-33014531108600/source _original_basename=tmp_ywwvpke follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:18 np0005581840 python3[6398]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:04:18 np0005581840 python3[6469]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1768223057.9238303-183-234196862966325/source _original_basename=tmpgpzzizkk follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:18 np0005581840 sudo[6569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpzwuifiombxxsdokfbfslqadyfeevnc ; /usr/bin/python3'
Jan 12 13:04:18 np0005581840 sudo[6569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:18 np0005581840 python3[6571]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:04:18 np0005581840 sudo[6569]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:18 np0005581840 sudo[6642]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnlvokwarzkefbmgtfvjqcntifwwbuqe ; /usr/bin/python3'
Jan 12 13:04:18 np0005581840 sudo[6642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:19 np0005581840 python3[6644]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1768223058.655092-231-183137751024585/source _original_basename=tmpyxzdih4n follow=False checksum=2f3f767b39920175f9daa3b8699b7b0271bc9c3a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:19 np0005581840 sudo[6642]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:19 np0005581840 python3[6692]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:04:19 np0005581840 python3[6718]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:04:19 np0005581840 sudo[6796]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plohmkhqdaltegiigipbmzizviywpxsp ; /usr/bin/python3'
Jan 12 13:04:19 np0005581840 sudo[6796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:19 np0005581840 python3[6798]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:04:19 np0005581840 sudo[6796]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:20 np0005581840 sudo[6869]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzpryqgmygavfoxgwrgovpbakpfzxuwi ; /usr/bin/python3'
Jan 12 13:04:20 np0005581840 sudo[6869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:20 np0005581840 python3[6871]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1768223059.7951703-273-72310228161903/source _original_basename=tmp794r_cn5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:20 np0005581840 sudo[6869]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:20 np0005581840 sudo[6920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onkivelteqxjahcasqjqhbragrislylv ; /usr/bin/python3'
Jan 12 13:04:20 np0005581840 sudo[6920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:20 np0005581840 python3[6922]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e1b-5cea-8a52-b6d3-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:04:20 np0005581840 sudo[6920]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:21 np0005581840 python3[6950]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                             _uses_shell=True zuul_log_id=fa163e1b-5cea-8a52-b6d3-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 12 13:04:22 np0005581840 python3[6979]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:28 np0005581840 irqbalance[771]: Cannot change IRQ 43 affinity: Operation not permitted
Jan 12 13:04:28 np0005581840 irqbalance[771]: IRQ 43 affinity is now unmanaged
Jan 12 13:04:36 np0005581840 sudo[7003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtdvdurpuvqwjqqnuhthfulktteiekud ; /usr/bin/python3'
Jan 12 13:04:36 np0005581840 sudo[7003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:04:37 np0005581840 python3[7005]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:04:37 np0005581840 sudo[7003]: pam_unix(sudo:session): session closed for user root
Jan 12 13:04:47 np0005581840 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 12 13:04:59 np0005581840 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 12 13:04:59 np0005581840 kernel: pci 0000:07:00.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 12 13:04:59 np0005581840 kernel: pci 0000:07:00.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 12 13:04:59 np0005581840 kernel: pci 0000:07:00.0: ROM [mem 0x00000000-0x0003ffff pref]
Jan 12 13:04:59 np0005581840 kernel: pci 0000:07:00.0: ROM [mem 0xfe000000-0xfe03ffff pref]: assigned
Jan 12 13:04:59 np0005581840 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfb600000-0xfb603fff 64bit pref]: assigned
Jan 12 13:04:59 np0005581840 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfe040000-0xfe040fff]: assigned
Jan 12 13:04:59 np0005581840 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5453] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 12 13:04:59 np0005581840 systemd-udevd[7008]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5556] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5577] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5581] device (eth1): carrier: link connected
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5583] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5589] policy: auto-activating connection 'Wired connection 1' (80b55774-fe11-3aa1-9b8b-d743e1ee863f)
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5592] device (eth1): Activation: starting connection 'Wired connection 1' (80b55774-fe11-3aa1-9b8b-d743e1ee863f)
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5593] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5596] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5600] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:04:59 np0005581840 NetworkManager[811]: <info>  [1768223099.5605] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:05:00 np0005581840 python3[7035]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e1b-5cea-f395-ae5f-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:05:06 np0005581840 sudo[7113]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loyehnazhftrykngxbfkczffidsxoozx ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Jan 12 13:05:06 np0005581840 sudo[7113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:05:06 np0005581840 python3[7115]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:05:06 np0005581840 sudo[7113]: pam_unix(sudo:session): session closed for user root
Jan 12 13:05:06 np0005581840 sudo[7186]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypixkvxowwvkjzgiuzmjqgjqxbsungln ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Jan 12 13:05:06 np0005581840 sudo[7186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:05:06 np0005581840 python3[7188]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768223106.525974-111-115569903636462/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=109bfc8463900148589f7704d0d92d3bbe94de59 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:05:06 np0005581840 sudo[7186]: pam_unix(sudo:session): session closed for user root
Jan 12 13:05:07 np0005581840 sudo[7236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uocgxziecwtudgpekwdagjmgslarstuu ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Jan 12 13:05:07 np0005581840 sudo[7236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:05:07 np0005581840 python3[7238]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:05:07 np0005581840 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 12 13:05:07 np0005581840 systemd[1]: Stopped Network Manager Wait Online.
Jan 12 13:05:07 np0005581840 systemd[1]: Stopping Network Manager Wait Online...
Jan 12 13:05:07 np0005581840 systemd[1]: Stopping Network Manager...
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4955] caught SIGTERM, shutting down normally.
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4962] dhcp4 (eth0): canceled DHCP transaction
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4962] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4962] dhcp4 (eth0): state changed no lease
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4963] dhcp6 (eth0): canceled DHCP transaction
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4964] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4964] dhcp6 (eth0): state changed no lease
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.4966] manager: NetworkManager state is now CONNECTING
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.5069] dhcp4 (eth1): canceled DHCP transaction
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.5069] dhcp4 (eth1): state changed no lease
Jan 12 13:05:07 np0005581840 NetworkManager[811]: <info>  [1768223107.5096] exiting (success)
Jan 12 13:05:07 np0005581840 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 12 13:05:07 np0005581840 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 12 13:05:07 np0005581840 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 12 13:05:07 np0005581840 systemd[1]: Stopped Network Manager.
Jan 12 13:05:07 np0005581840 systemd[1]: Starting Network Manager...
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.5559] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:94c17a7f-65c5-449e-af1b-e34d5bf3c7ea)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.5560] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.5597] manager[0x55e0ad64b000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 12 13:05:07 np0005581840 systemd[1]: Starting Hostname Service...
Jan 12 13:05:07 np0005581840 systemd[1]: Started Hostname Service.
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6109] hostname: hostname: using hostnamed
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6110] hostname: static hostname changed from (none) to "np0005581840"
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6112] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6115] manager[0x55e0ad64b000]: rfkill: Wi-Fi hardware radio set enabled
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6115] manager[0x55e0ad64b000]: rfkill: WWAN hardware radio set enabled
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6133] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6133] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6133] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6134] manager: Networking is enabled by state file
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6135] settings: Loaded settings plugin: keyfile (internal)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6143] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6166] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6172] dhcp: init: Using DHCP client 'internal'
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6174] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6177] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6181] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6189] device (lo): Activation: starting connection 'lo' (32e0de59-2ecd-409f-9e28-312d4eb0815d)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6195] device (eth0): carrier: link connected
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6198] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6201] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6202] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6207] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6211] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6220] device (eth1): carrier: link connected
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6224] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6227] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (80b55774-fe11-3aa1-9b8b-d743e1ee863f) (indicated)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6227] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6234] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6239] device (eth1): Activation: starting connection 'Wired connection 1' (80b55774-fe11-3aa1-9b8b-d743e1ee863f)
Jan 12 13:05:07 np0005581840 systemd[1]: Started Network Manager.
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6243] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6246] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6247] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6249] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6250] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6251] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6252] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6253] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6255] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6258] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6260] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6262] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6263] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6267] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6270] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6283] dhcp4 (eth0): state changed new lease, address=192.168.25.114
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6287] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 12 13:05:07 np0005581840 systemd[1]: Starting Network Manager Wait Online...
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6314] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6315] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 12 13:05:07 np0005581840 NetworkManager[7251]: <info>  [1768223107.6318] device (lo): Activation: successful, device activated.
Jan 12 13:05:07 np0005581840 sudo[7236]: pam_unix(sudo:session): session closed for user root
Jan 12 13:05:07 np0005581840 python3[7310]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e1b-5cea-f395-ae5f-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:05:08 np0005581840 NetworkManager[7251]: <info>  [1768223108.6719] dhcp6 (eth0): state changed new lease, address=2001:db8::10a
Jan 12 13:05:08 np0005581840 NetworkManager[7251]: <info>  [1768223108.6727] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 12 13:05:08 np0005581840 NetworkManager[7251]: <info>  [1768223108.6750] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 12 13:05:08 np0005581840 NetworkManager[7251]: <info>  [1768223108.6752] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 12 13:05:08 np0005581840 NetworkManager[7251]: <info>  [1768223108.6755] manager: NetworkManager state is now CONNECTED_SITE
Jan 12 13:05:08 np0005581840 NetworkManager[7251]: <info>  [1768223108.6757] device (eth0): Activation: successful, device activated.
Jan 12 13:05:08 np0005581840 NetworkManager[7251]: <info>  [1768223108.6761] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 12 13:05:18 np0005581840 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 12 13:05:37 np0005581840 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 12 13:05:42 np0005581840 sudo[7409]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzftoyxdfzwxyixvqnhxwudzvyuxgsrh ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Jan 12 13:05:42 np0005581840 sudo[7409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:05:42 np0005581840 python3[7411]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:05:42 np0005581840 sudo[7409]: pam_unix(sudo:session): session closed for user root
Jan 12 13:05:43 np0005581840 sudo[7482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzausndkqlykrygnyngjplkzkjtyevvk ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Jan 12 13:05:43 np0005581840 sudo[7482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:05:43 np0005581840 python3[7484]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768223142.7882974-265-250927737143006/source _original_basename=tmp34y56vje follow=False checksum=3c0905e98eb602f209052066da22423b6549571d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:05:43 np0005581840 sudo[7482]: pam_unix(sudo:session): session closed for user root
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9425] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 12 13:05:52 np0005581840 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 12 13:05:52 np0005581840 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9659] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9660] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9662] device (eth1): Activation: successful, device activated.
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9665] manager: startup complete
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9666] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <warn>  [1768223152.9668] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9672] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 12 13:05:52 np0005581840 systemd[1]: Finished Network Manager Wait Online.
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9739] dhcp4 (eth1): canceled DHCP transaction
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9739] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9739] dhcp4 (eth1): state changed no lease
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9747] policy: auto-activating connection 'ci-private-network' (cd22dd6a-6b6d-5d12-97bb-b6ad4ed87c99)
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9750] device (eth1): Activation: starting connection 'ci-private-network' (cd22dd6a-6b6d-5d12-97bb-b6ad4ed87c99)
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9750] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9752] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9756] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9761] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9781] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9782] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:05:52 np0005581840 NetworkManager[7251]: <info>  [1768223152.9786] device (eth1): Activation: successful, device activated.
Jan 12 13:06:03 np0005581840 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 12 13:06:27 np0005581840 systemd[4373]: Starting Mark boot as successful...
Jan 12 13:06:27 np0005581840 systemd[4373]: Finished Mark boot as successful.
Jan 12 13:06:43 np0005581840 sshd-session[4382]: Received disconnect from 192.168.25.12 port 57328:11: disconnected by user
Jan 12 13:06:43 np0005581840 sshd-session[4382]: Disconnected from user zuul 192.168.25.12 port 57328
Jan 12 13:06:43 np0005581840 sshd-session[4369]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:06:43 np0005581840 systemd-logind[775]: Session 1 logged out. Waiting for processes to exit.
Jan 12 13:09:27 np0005581840 systemd[4373]: Created slice User Background Tasks Slice.
Jan 12 13:09:27 np0005581840 systemd[4373]: Starting Cleanup of User's Temporary Files and Directories...
Jan 12 13:09:27 np0005581840 systemd[4373]: Finished Cleanup of User's Temporary Files and Directories.
Jan 12 13:10:53 np0005581840 sshd-session[7537]: Accepted publickey for zuul from 192.168.25.12 port 49222 ssh2: RSA SHA256:fI1ARQuzDFaG6ZRVjTrjERJMRfRFskzrjkxsBiWm2/0
Jan 12 13:10:53 np0005581840 systemd-logind[775]: New session 3 of user zuul.
Jan 12 13:10:53 np0005581840 systemd[1]: Started Session 3 of User zuul.
Jan 12 13:10:53 np0005581840 sshd-session[7537]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:10:53 np0005581840 sudo[7564]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cueodwggjftcgonjbtrkvaxppaxroeyf ; /usr/bin/python3'
Jan 12 13:10:53 np0005581840 sudo[7564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:53 np0005581840 python3[7566]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e1b-5cea-1eaf-aa9d-000000002161-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:10:53 np0005581840 sudo[7564]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:53 np0005581840 sudo[7593]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xekiapyzazmungtmwgoeeubdtbmtqvtw ; /usr/bin/python3'
Jan 12 13:10:53 np0005581840 sudo[7593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:53 np0005581840 python3[7595]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:10:53 np0005581840 sudo[7593]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:53 np0005581840 sudo[7619]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlbbqnabcjuldrisuhfzpryergkymxv ; /usr/bin/python3'
Jan 12 13:10:53 np0005581840 sudo[7619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:53 np0005581840 python3[7621]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:10:53 np0005581840 sudo[7619]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:54 np0005581840 sudo[7645]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baniejuplsukhzphvefxrqhldlaqhlnl ; /usr/bin/python3'
Jan 12 13:10:54 np0005581840 sudo[7645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:54 np0005581840 python3[7647]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:10:54 np0005581840 sudo[7645]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:54 np0005581840 sudo[7671]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsnwrnropqowiucafszepiuagocskkhm ; /usr/bin/python3'
Jan 12 13:10:54 np0005581840 sudo[7671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:54 np0005581840 python3[7673]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:10:54 np0005581840 sudo[7671]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:54 np0005581840 sudo[7697]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcvjzkuduuhrfjutmlbmvrgdfedfsjil ; /usr/bin/python3'
Jan 12 13:10:54 np0005581840 sudo[7697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:54 np0005581840 python3[7699]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:10:54 np0005581840 sudo[7697]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:54 np0005581840 sudo[7775]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrdhjnyvwazptwfubvezretgechzkmsa ; /usr/bin/python3'
Jan 12 13:10:54 np0005581840 sudo[7775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:55 np0005581840 python3[7777]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:10:55 np0005581840 sudo[7775]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:55 np0005581840 sudo[7848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pheqovntjogbulkcxjduloekljsmkpmk ; /usr/bin/python3'
Jan 12 13:10:55 np0005581840 sudo[7848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:55 np0005581840 python3[7850]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768223454.9017992-493-165097189236020/source _original_basename=tmpsunjkicj follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:10:55 np0005581840 sudo[7848]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:55 np0005581840 sudo[7898]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyzbmevtjxvjwdshecjqxfowbzjsdgua ; /usr/bin/python3'
Jan 12 13:10:55 np0005581840 sudo[7898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:56 np0005581840 python3[7900]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:10:56 np0005581840 systemd[1]: Reloading.
Jan 12 13:10:56 np0005581840 systemd-rc-local-generator[7918]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:10:56 np0005581840 sudo[7898]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:57 np0005581840 sudo[7954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cblfgojjlvzkdapyhpszympcporgjeuv ; /usr/bin/python3'
Jan 12 13:10:57 np0005581840 sudo[7954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:57 np0005581840 python3[7956]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 12 13:10:57 np0005581840 sudo[7954]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:57 np0005581840 sudo[7980]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmwawldwtupdovvaqyvwfvuxinkugqcn ; /usr/bin/python3'
Jan 12 13:10:57 np0005581840 sudo[7980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:57 np0005581840 python3[7982]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:10:57 np0005581840 sudo[7980]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:57 np0005581840 sudo[8008]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsawyvvlyxumjyomxpuapwnqrgqvwmms ; /usr/bin/python3'
Jan 12 13:10:57 np0005581840 sudo[8008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:57 np0005581840 python3[8010]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:10:57 np0005581840 sudo[8008]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:57 np0005581840 sudo[8036]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vspdxmindumbzptxgvfvfuldgqruoewh ; /usr/bin/python3'
Jan 12 13:10:57 np0005581840 sudo[8036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:57 np0005581840 python3[8038]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:10:57 np0005581840 sudo[8036]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:57 np0005581840 sudo[8064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxfhkazxczlrdlhwkcqxtwrpdhvamfrm ; /usr/bin/python3'
Jan 12 13:10:57 np0005581840 sudo[8064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:10:58 np0005581840 python3[8066]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:10:58 np0005581840 sudo[8064]: pam_unix(sudo:session): session closed for user root
Jan 12 13:10:58 np0005581840 python3[8093]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163e1b-5cea-1eaf-aa9d-000000002168-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:10:59 np0005581840 python3[8123]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 12 13:11:00 np0005581840 sshd-session[7540]: Connection closed by 192.168.25.12 port 49222
Jan 12 13:11:00 np0005581840 sshd-session[7537]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:11:00 np0005581840 systemd-logind[775]: Session 3 logged out. Waiting for processes to exit.
Jan 12 13:11:00 np0005581840 systemd[1]: session-3.scope: Deactivated successfully.
Jan 12 13:11:00 np0005581840 systemd[1]: session-3.scope: Consumed 2.878s CPU time.
Jan 12 13:11:00 np0005581840 systemd-logind[775]: Removed session 3.
Jan 12 13:11:02 np0005581840 sshd-session[8130]: Accepted publickey for zuul from 192.168.25.12 port 35976 ssh2: RSA SHA256:fI1ARQuzDFaG6ZRVjTrjERJMRfRFskzrjkxsBiWm2/0
Jan 12 13:11:02 np0005581840 systemd-logind[775]: New session 4 of user zuul.
Jan 12 13:11:02 np0005581840 systemd[1]: Started Session 4 of User zuul.
Jan 12 13:11:02 np0005581840 sshd-session[8130]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:11:02 np0005581840 sudo[8157]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjiqzqeiodvokmgzlvnihkjielvnbndv ; /usr/bin/python3'
Jan 12 13:11:02 np0005581840 sudo[8157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:11:02 np0005581840 python3[8159]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 12 13:11:19 np0005581840 setsebool[8198]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 12 13:11:19 np0005581840 setsebool[8198]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 12 13:11:28 np0005581840 kernel: SELinux:  Converting 384 SID table entries...
Jan 12 13:11:28 np0005581840 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:11:28 np0005581840 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:11:28 np0005581840 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:11:28 np0005581840 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:11:28 np0005581840 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:11:28 np0005581840 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:11:28 np0005581840 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:11:34 np0005581840 kernel: SELinux:  Converting 387 SID table entries...
Jan 12 13:11:34 np0005581840 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:11:34 np0005581840 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:11:34 np0005581840 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:11:34 np0005581840 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:11:34 np0005581840 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:11:34 np0005581840 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:11:34 np0005581840 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:11:46 np0005581840 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 12 13:11:46 np0005581840 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:11:46 np0005581840 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:11:46 np0005581840 systemd[1]: Reloading.
Jan 12 13:11:46 np0005581840 systemd-rc-local-generator[8960]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:11:46 np0005581840 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:11:47 np0005581840 sudo[8157]: pam_unix(sudo:session): session closed for user root
Jan 12 13:11:50 np0005581840 python3[14028]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                              _uses_shell=True zuul_log_id=fa163e1b-5cea-4c82-0777-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:11:51 np0005581840 kernel: evm: overlay not supported
Jan 12 13:11:51 np0005581840 systemd[4373]: Starting D-Bus User Message Bus...
Jan 12 13:11:51 np0005581840 dbus-broker-launch[14758]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 12 13:11:51 np0005581840 dbus-broker-launch[14758]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 12 13:11:51 np0005581840 systemd[4373]: Started D-Bus User Message Bus.
Jan 12 13:11:51 np0005581840 dbus-broker-lau[14758]: Ready
Jan 12 13:11:51 np0005581840 systemd[4373]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 12 13:11:51 np0005581840 systemd[4373]: Created slice Slice /user.
Jan 12 13:11:51 np0005581840 systemd[4373]: podman-14688.scope: unit configures an IP firewall, but not running as root.
Jan 12 13:11:51 np0005581840 systemd[4373]: (This warning is only shown for the first unit using IP firewalling.)
Jan 12 13:11:51 np0005581840 systemd[4373]: Started podman-14688.scope.
Jan 12 13:11:52 np0005581840 systemd[4373]: Started podman-pause-f4db445d.scope.
Jan 12 13:11:52 np0005581840 sshd-session[8133]: Connection closed by 192.168.25.12 port 35976
Jan 12 13:11:52 np0005581840 sshd-session[8130]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:11:52 np0005581840 systemd[1]: session-4.scope: Deactivated successfully.
Jan 12 13:11:52 np0005581840 systemd[1]: session-4.scope: Consumed 30.134s CPU time.
Jan 12 13:11:52 np0005581840 systemd-logind[775]: Session 4 logged out. Waiting for processes to exit.
Jan 12 13:11:52 np0005581840 systemd-logind[775]: Removed session 4.
Jan 12 13:12:05 np0005581840 sshd-session[25831]: Connection closed by 192.168.25.49 port 38580 [preauth]
Jan 12 13:12:05 np0005581840 sshd-session[25833]: Connection closed by 192.168.25.49 port 38592 [preauth]
Jan 12 13:12:05 np0005581840 sshd-session[25834]: Unable to negotiate with 192.168.25.49 port 38600: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 12 13:12:05 np0005581840 sshd-session[25838]: Unable to negotiate with 192.168.25.49 port 38608: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 12 13:12:05 np0005581840 sshd-session[25839]: Unable to negotiate with 192.168.25.49 port 38624: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 12 13:12:10 np0005581840 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:12:10 np0005581840 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:12:10 np0005581840 systemd[1]: man-db-cache-update.service: Consumed 28.698s CPU time.
Jan 12 13:12:10 np0005581840 systemd[1]: run-r37177d0c58184b909985ecbbd79ee75c.service: Deactivated successfully.
Jan 12 13:12:14 np0005581840 sshd-session[29583]: Accepted publickey for zuul from 192.168.25.12 port 57118 ssh2: RSA SHA256:fI1ARQuzDFaG6ZRVjTrjERJMRfRFskzrjkxsBiWm2/0
Jan 12 13:12:14 np0005581840 systemd-logind[775]: New session 5 of user zuul.
Jan 12 13:12:14 np0005581840 systemd[1]: Started Session 5 of User zuul.
Jan 12 13:12:14 np0005581840 sshd-session[29583]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:12:14 np0005581840 python3[29610]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAkR8QpJP9Cno5hjBTLe2MsNw4kLzPKlUw7GGX5poWKsmlAjc16sQjuWP2vS3e1N+YxhKzoEC/qLaqhw9TJxKkE= zuul@np0005581839
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:12:15 np0005581840 sudo[29634]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlivtgbxunfdxsxaiydssteplivpajsp ; /usr/bin/python3'
Jan 12 13:12:15 np0005581840 sudo[29634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:12:15 np0005581840 python3[29636]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAkR8QpJP9Cno5hjBTLe2MsNw4kLzPKlUw7GGX5poWKsmlAjc16sQjuWP2vS3e1N+YxhKzoEC/qLaqhw9TJxKkE= zuul@np0005581839
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:12:15 np0005581840 sudo[29634]: pam_unix(sudo:session): session closed for user root
Jan 12 13:12:15 np0005581840 sudo[29660]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xainvqjxxgmayvcnxrdokbjmdrucpvvq ; /usr/bin/python3'
Jan 12 13:12:15 np0005581840 sudo[29660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:12:15 np0005581840 python3[29662]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005581840 update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 12 13:12:15 np0005581840 useradd[29664]: new group: name=cloud-admin, GID=1002
Jan 12 13:12:15 np0005581840 useradd[29664]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 12 13:12:15 np0005581840 sudo[29660]: pam_unix(sudo:session): session closed for user root
Jan 12 13:12:16 np0005581840 sudo[29694]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stuxmxclcztjelfqqvzcyenhtnxfbfiy ; /usr/bin/python3'
Jan 12 13:12:16 np0005581840 sudo[29694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:12:16 np0005581840 python3[29696]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAkR8QpJP9Cno5hjBTLe2MsNw4kLzPKlUw7GGX5poWKsmlAjc16sQjuWP2vS3e1N+YxhKzoEC/qLaqhw9TJxKkE= zuul@np0005581839
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 12 13:12:16 np0005581840 sudo[29694]: pam_unix(sudo:session): session closed for user root
Jan 12 13:12:16 np0005581840 sudo[29772]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odthklnpxbhqlqpbufdgvxuxsdndrffl ; /usr/bin/python3'
Jan 12 13:12:16 np0005581840 sudo[29772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:12:16 np0005581840 python3[29774]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:12:16 np0005581840 sudo[29772]: pam_unix(sudo:session): session closed for user root
Jan 12 13:12:16 np0005581840 sudo[29845]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbngitscevnnggnjclwgxwqbsftbhohw ; /usr/bin/python3'
Jan 12 13:12:16 np0005581840 sudo[29845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:12:16 np0005581840 python3[29847]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1768223536.275503-120-55833920992759/source _original_basename=tmp3b3a34jo follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:12:16 np0005581840 sudo[29845]: pam_unix(sudo:session): session closed for user root
Jan 12 13:12:17 np0005581840 sudo[29895]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpphbgroylnuojssegmfpncsrztdmruk ; /usr/bin/python3'
Jan 12 13:12:17 np0005581840 sudo[29895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:12:17 np0005581840 python3[29897]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 12 13:12:17 np0005581840 systemd[1]: Starting Hostname Service...
Jan 12 13:12:17 np0005581840 systemd[1]: Started Hostname Service.
Jan 12 13:12:17 np0005581840 systemd-hostnamed[29901]: Changed pretty hostname to 'compute-0'
Jan 12 13:12:17 compute-0 systemd-hostnamed[29901]: Hostname set to <compute-0> (static)
Jan 12 13:12:17 compute-0 NetworkManager[7251]: <info>  [1768223537.4768] hostname: static hostname changed from "np0005581840" to "compute-0"
Jan 12 13:12:17 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 12 13:12:17 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 12 13:12:17 compute-0 sudo[29895]: pam_unix(sudo:session): session closed for user root
Jan 12 13:12:17 compute-0 sshd-session[29586]: Connection closed by 192.168.25.12 port 57118
Jan 12 13:12:17 compute-0 sshd-session[29583]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:12:17 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Jan 12 13:12:17 compute-0 systemd[1]: session-5.scope: Consumed 1.608s CPU time.
Jan 12 13:12:17 compute-0 systemd-logind[775]: Session 5 logged out. Waiting for processes to exit.
Jan 12 13:12:17 compute-0 systemd-logind[775]: Removed session 5.
Jan 12 13:12:27 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 12 13:12:47 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 12 13:15:31 compute-0 sshd-session[29922]: Accepted publickey for zuul from 192.168.25.49 port 57630 ssh2: RSA SHA256:fI1ARQuzDFaG6ZRVjTrjERJMRfRFskzrjkxsBiWm2/0
Jan 12 13:15:31 compute-0 systemd-logind[775]: New session 6 of user zuul.
Jan 12 13:15:31 compute-0 systemd[1]: Started Session 6 of User zuul.
Jan 12 13:15:31 compute-0 sshd-session[29922]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:15:31 compute-0 python3[29998]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:15:32 compute-0 sudo[30108]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxfstiixxewiigwmflakrugkigconzcv ; /usr/bin/python3'
Jan 12 13:15:32 compute-0 sudo[30108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:32 compute-0 python3[30110]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:15:33 compute-0 sudo[30108]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:33 compute-0 sudo[30181]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uehlfkksiftqylaibzizvpttoverdkhx ; /usr/bin/python3'
Jan 12 13:15:33 compute-0 sudo[30181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:33 compute-0 python3[30183]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768223732.7771347-34001-188301131415666/source mode=0755 _original_basename=delorean.repo follow=False checksum=619eee7d4b000c2fdbd89639e9af5cd9cd1e4284 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:15:33 compute-0 sudo[30181]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:33 compute-0 sudo[30207]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vulurpjxcotdiwphgwgdydymtemrtlfr ; /usr/bin/python3'
Jan 12 13:15:33 compute-0 sudo[30207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:33 compute-0 python3[30209]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:15:33 compute-0 sudo[30207]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:33 compute-0 sudo[30280]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppfjrvrgkqvvyzlyyufifurbbgpvmjar ; /usr/bin/python3'
Jan 12 13:15:33 compute-0 sudo[30280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:33 compute-0 python3[30282]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768223732.7771347-34001-188301131415666/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=32cab4d7d3069e03e1e375a1684f22cb2eb72603 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:15:33 compute-0 sudo[30280]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:33 compute-0 sudo[30306]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofukvsfplbglrqwzccewichhrmbpsvak ; /usr/bin/python3'
Jan 12 13:15:33 compute-0 sudo[30306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:33 compute-0 python3[30308]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:15:33 compute-0 sudo[30306]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:34 compute-0 sudo[30379]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zscwwravuridfljpsnvfchzuzuysagqb ; /usr/bin/python3'
Jan 12 13:15:34 compute-0 sudo[30379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:34 compute-0 python3[30381]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768223732.7771347-34001-188301131415666/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=5c739387d960f7119f9d22475c90dcd56f13e885 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:15:34 compute-0 sudo[30379]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:34 compute-0 sudo[30405]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mavkekchuqautidlnqgtovprppbfxcmt ; /usr/bin/python3'
Jan 12 13:15:34 compute-0 sudo[30405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:34 compute-0 python3[30407]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:15:34 compute-0 sudo[30405]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:34 compute-0 sudo[30478]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ancfvmwkpsrkkxjnjoqpupwaaaxlyhte ; /usr/bin/python3'
Jan 12 13:15:34 compute-0 sudo[30478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:34 compute-0 python3[30480]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768223732.7771347-34001-188301131415666/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=8c00581855ef07972e002c82cc33b7b03ecccc44 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:15:34 compute-0 sudo[30478]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:34 compute-0 sudo[30504]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnevtflhfizuxbuulwfwrplxyblylzfd ; /usr/bin/python3'
Jan 12 13:15:34 compute-0 sudo[30504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:34 compute-0 python3[30506]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:15:34 compute-0 sudo[30504]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:34 compute-0 sudo[30577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fafiawfazsvkmuzizniykuzifcmwurqj ; /usr/bin/python3'
Jan 12 13:15:34 compute-0 sudo[30577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:34 compute-0 python3[30579]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768223732.7771347-34001-188301131415666/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=5515871802d2268513e691cf460c59c7da7132f9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:15:35 compute-0 sudo[30577]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:35 compute-0 sudo[30603]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czmjahxmlqrhcoexmcjdnrzmzzfawzdn ; /usr/bin/python3'
Jan 12 13:15:35 compute-0 sudo[30603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:35 compute-0 python3[30605]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:15:35 compute-0 sudo[30603]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:35 compute-0 sudo[30676]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrwsyntrlcbzswdbrugrslggyizllkui ; /usr/bin/python3'
Jan 12 13:15:35 compute-0 sudo[30676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:35 compute-0 python3[30678]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768223732.7771347-34001-188301131415666/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=c87c0371a768c46886c8904021e8b85df789a625 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:15:35 compute-0 sudo[30676]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:35 compute-0 sudo[30702]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcabtqkldepyoswyjuxkccwtiugxcsnd ; /usr/bin/python3'
Jan 12 13:15:35 compute-0 sudo[30702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:35 compute-0 python3[30704]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 12 13:15:35 compute-0 sudo[30702]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:35 compute-0 sudo[30775]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cixfruaxjaqyvigoehqlruvtiylegdee ; /usr/bin/python3'
Jan 12 13:15:35 compute-0 sudo[30775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:15:35 compute-0 python3[30777]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1768223732.7771347-34001-188301131415666/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:15:35 compute-0 sudo[30775]: pam_unix(sudo:session): session closed for user root
Jan 12 13:15:37 compute-0 sshd-session[30802]: Connection closed by 192.168.122.11 port 52394 [preauth]
Jan 12 13:15:37 compute-0 sshd-session[30803]: Connection closed by 192.168.122.11 port 52408 [preauth]
Jan 12 13:15:37 compute-0 sshd-session[30804]: Unable to negotiate with 192.168.122.11 port 52420: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 12 13:15:37 compute-0 sshd-session[30805]: Unable to negotiate with 192.168.122.11 port 52430: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 12 13:15:37 compute-0 sshd-session[30806]: Unable to negotiate with 192.168.122.11 port 52440: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 12 13:15:44 compute-0 python3[30835]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:17:27 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 12 13:17:27 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 12 13:17:27 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 12 13:17:27 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 12 13:20:44 compute-0 sshd-session[29925]: Received disconnect from 192.168.25.49 port 57630:11: disconnected by user
Jan 12 13:20:44 compute-0 sshd-session[29925]: Disconnected from user zuul 192.168.25.49 port 57630
Jan 12 13:20:44 compute-0 sshd-session[29922]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:20:44 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 12 13:20:44 compute-0 systemd[1]: session-6.scope: Consumed 3.431s CPU time.
Jan 12 13:20:44 compute-0 systemd-logind[775]: Session 6 logged out. Waiting for processes to exit.
Jan 12 13:20:44 compute-0 systemd-logind[775]: Removed session 6.
Jan 12 13:25:55 compute-0 sshd-session[30843]: Accepted publickey for zuul from 192.168.122.30 port 35320 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:25:55 compute-0 systemd-logind[775]: New session 7 of user zuul.
Jan 12 13:25:55 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 12 13:25:55 compute-0 sshd-session[30843]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:25:56 compute-0 python3.9[30996]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:25:57 compute-0 sudo[31175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdapaatcrlukmmypcerieqjhknpmzebj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224356.8953066-27-148492640714585/AnsiballZ_command.py'
Jan 12 13:25:57 compute-0 sudo[31175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:25:57 compute-0 python3.9[31177]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:26:05 compute-0 sudo[31175]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:05 compute-0 sshd-session[30846]: Connection closed by 192.168.122.30 port 35320
Jan 12 13:26:05 compute-0 sshd-session[30843]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:26:05 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 12 13:26:05 compute-0 systemd[1]: session-7.scope: Consumed 5.990s CPU time.
Jan 12 13:26:05 compute-0 systemd-logind[775]: Session 7 logged out. Waiting for processes to exit.
Jan 12 13:26:05 compute-0 systemd-logind[775]: Removed session 7.
Jan 12 13:26:10 compute-0 sshd-session[31235]: Accepted publickey for zuul from 192.168.122.30 port 42560 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:26:10 compute-0 systemd-logind[775]: New session 8 of user zuul.
Jan 12 13:26:10 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 12 13:26:10 compute-0 sshd-session[31235]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:26:11 compute-0 python3.9[31388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:26:11 compute-0 sshd-session[31238]: Connection closed by 192.168.122.30 port 42560
Jan 12 13:26:11 compute-0 sshd-session[31235]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:26:11 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 12 13:26:11 compute-0 systemd-logind[775]: Session 8 logged out. Waiting for processes to exit.
Jan 12 13:26:11 compute-0 systemd-logind[775]: Removed session 8.
Jan 12 13:26:27 compute-0 sshd-session[31417]: Accepted publickey for zuul from 192.168.122.30 port 36284 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:26:27 compute-0 systemd-logind[775]: New session 9 of user zuul.
Jan 12 13:26:27 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 12 13:26:27 compute-0 sshd-session[31417]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:26:27 compute-0 python3.9[31570]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 12 13:26:28 compute-0 python3.9[31744]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:26:28 compute-0 sudo[31894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlfeljdrvnafrfaoyltrtyffzpwxqrhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224388.6576753-40-274337131513962/AnsiballZ_command.py'
Jan 12 13:26:28 compute-0 sudo[31894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:29 compute-0 python3.9[31896]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:26:29 compute-0 sudo[31894]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:29 compute-0 sudo[32047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mblllgjbqocwxelnokxouvimgfbgwdro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224389.2715483-52-227040728971311/AnsiballZ_stat.py'
Jan 12 13:26:29 compute-0 sudo[32047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:29 compute-0 python3.9[32049]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:26:29 compute-0 sudo[32047]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:30 compute-0 sudo[32199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqnmkzdqfvibriontzavsgfyfbgxbahk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224389.7866895-60-133390637044681/AnsiballZ_file.py'
Jan 12 13:26:30 compute-0 sudo[32199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:30 compute-0 python3.9[32201]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:26:30 compute-0 sudo[32199]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:30 compute-0 sudo[32351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzejzwqcoypftovytrsmsjfmwecqidqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224390.322925-68-249816759043412/AnsiballZ_stat.py'
Jan 12 13:26:30 compute-0 sudo[32351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:30 compute-0 python3.9[32353]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:26:30 compute-0 sudo[32351]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:30 compute-0 sudo[32474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkbvqzbnjdrovbfjwkycbglumgnndtao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224390.322925-68-249816759043412/AnsiballZ_copy.py'
Jan 12 13:26:30 compute-0 sudo[32474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:31 compute-0 python3.9[32476]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224390.322925-68-249816759043412/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:26:31 compute-0 sudo[32474]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:31 compute-0 sudo[32626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trxuxorqwskqexafqgkvhmwsjdzyclvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224391.226148-83-112088758525931/AnsiballZ_setup.py'
Jan 12 13:26:31 compute-0 sudo[32626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:31 compute-0 python3.9[32628]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:26:31 compute-0 sudo[32626]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:32 compute-0 sudo[32782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jezcuqcbtjltbtreybbhrggpsfcgxfus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224391.8946493-91-5988965594814/AnsiballZ_file.py'
Jan 12 13:26:32 compute-0 sudo[32782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:32 compute-0 python3.9[32784]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:26:32 compute-0 sudo[32782]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:32 compute-0 sudo[32934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrnipmbwbfoocphkfkcvnofuwmtsdvai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224392.3601549-100-218504814068809/AnsiballZ_file.py'
Jan 12 13:26:32 compute-0 sudo[32934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:32 compute-0 python3.9[32936]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:26:32 compute-0 sudo[32934]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:33 compute-0 python3.9[33086]: ansible-ansible.builtin.service_facts Invoked
Jan 12 13:26:35 compute-0 python3.9[33339]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:26:36 compute-0 python3.9[33489]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:26:37 compute-0 python3.9[33643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:26:37 compute-0 sudo[33799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwvurrgcmmxpnwvbazdkvhcihpbyujxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224397.391084-148-144106406383661/AnsiballZ_setup.py'
Jan 12 13:26:37 compute-0 sudo[33799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:37 compute-0 python3.9[33801]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:26:38 compute-0 sudo[33799]: pam_unix(sudo:session): session closed for user root
Jan 12 13:26:38 compute-0 sudo[33883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vancdnloktpyqxpmxvhtnmpjzlbmynjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224397.391084-148-144106406383661/AnsiballZ_dnf.py'
Jan 12 13:26:38 compute-0 sudo[33883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:26:38 compute-0 python3.9[33885]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:27:54 compute-0 systemd[1]: Reloading.
Jan 12 13:27:54 compute-0 systemd-rc-local-generator[34085]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:27:54 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 12 13:27:54 compute-0 systemd[1]: Reloading.
Jan 12 13:27:54 compute-0 systemd-rc-local-generator[34126]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:27:55 compute-0 systemd[1]: Starting dnf makecache...
Jan 12 13:27:55 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 12 13:27:55 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 12 13:27:55 compute-0 systemd[1]: Reloading.
Jan 12 13:27:55 compute-0 systemd-rc-local-generator[34164]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:27:55 compute-0 dnf[34139]: Failed determining last makecache time.
Jan 12 13:27:55 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 12 13:27:55 compute-0 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Jan 12 13:27:55 compute-0 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Jan 12 13:27:55 compute-0 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Jan 12 13:27:55 compute-0 dnf[34139]: delorean-openstack-barbican-42b4c41831408a8e323  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:55 compute-0 dnf[34139]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:55 compute-0 dnf[34139]: delorean-openstack-cinder-1c00d6490d88e436f26ef  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:55 compute-0 dnf[34139]: delorean-python-stevedore-c4acc5639fd2329372142  20 kB/s | 3.0 kB     00:00
Jan 12 13:27:55 compute-0 dnf[34139]: delorean-python-cloudkitty-tests-tempest-2c80f8  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:56 compute-0 dnf[34139]: delorean-os-refresh-config-9bfc52b5049be2d8de61  22 kB/s | 3.0 kB     00:00
Jan 12 13:27:56 compute-0 dnf[34139]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:56 compute-0 dnf[34139]: delorean-python-designate-tests-tempest-347fdbc  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:56 compute-0 dnf[34139]: delorean-openstack-glance-1fd12c29b339f30fe823e  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:56 compute-0 dnf[34139]: delorean-openstack-keystone-e4b40af0ae3698fbbbb  20 kB/s | 3.0 kB     00:00
Jan 12 13:27:56 compute-0 dnf[34139]: delorean-openstack-manila-3c01b7181572c95dac462  20 kB/s | 3.0 kB     00:00
Jan 12 13:27:56 compute-0 dnf[34139]: delorean-python-whitebox-neutron-tests-tempest-  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:57 compute-0 dnf[34139]: delorean-openstack-octavia-ba397f07a7331190208c  23 kB/s | 3.0 kB     00:00
Jan 12 13:27:57 compute-0 dnf[34139]: delorean-openstack-watcher-c014f81a8647287f6dcc  20 kB/s | 3.0 kB     00:00
Jan 12 13:27:57 compute-0 dnf[34139]: delorean-ansible-config_template-5ccaa22121a7ff  22 kB/s | 3.0 kB     00:00
Jan 12 13:27:57 compute-0 dnf[34139]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:57 compute-0 dnf[34139]: delorean-openstack-swift-dc98a8463506ac520c469a  20 kB/s | 3.0 kB     00:00
Jan 12 13:27:57 compute-0 dnf[34139]: delorean-python-tempestconf-8515371b7cceebd4282  20 kB/s | 3.0 kB     00:00
Jan 12 13:27:57 compute-0 dnf[34139]: delorean-openstack-heat-ui-013accbfd179753bc3f0  21 kB/s | 3.0 kB     00:00
Jan 12 13:27:59 compute-0 dnf[34139]: CentOS Stream 9 - BaseOS                        4.6 kB/s | 6.7 kB     00:01
Jan 12 13:28:00 compute-0 dnf[34139]: CentOS Stream 9 - AppStream                     8.8 kB/s | 6.8 kB     00:00
Jan 12 13:28:01 compute-0 dnf[34139]: CentOS Stream 9 - CRB                            10 kB/s | 6.6 kB     00:00
Jan 12 13:28:03 compute-0 dnf[34139]: CentOS Stream 9 - Extras packages               3.2 kB/s | 7.3 kB     00:02
Jan 12 13:28:03 compute-0 dnf[34139]: dlrn-antelope-testing                            23 kB/s | 3.0 kB     00:00
Jan 12 13:28:03 compute-0 dnf[34139]: dlrn-antelope-build-deps                         21 kB/s | 3.0 kB     00:00
Jan 12 13:28:05 compute-0 dnf[34139]: centos9-rabbitmq                                2.1 kB/s | 3.0 kB     00:01
Jan 12 13:28:06 compute-0 dnf[34139]: centos9-storage                                 2.5 kB/s | 3.0 kB     00:01
Jan 12 13:28:06 compute-0 dnf[34139]: centos9-opstools                                7.0 kB/s | 3.0 kB     00:00
Jan 12 13:28:07 compute-0 dnf[34139]: NFV SIG OpenvSwitch                             6.6 kB/s | 3.0 kB     00:00
Jan 12 13:28:07 compute-0 dnf[34139]: repo-setup-centos-appstream                     9.3 kB/s | 4.4 kB     00:00
Jan 12 13:28:08 compute-0 dnf[34139]: repo-setup-centos-baseos                        9.0 kB/s | 3.9 kB     00:00
Jan 12 13:28:08 compute-0 dnf[34139]: repo-setup-centos-highavailability              9.1 kB/s | 3.9 kB     00:00
Jan 12 13:28:11 compute-0 dnf[34139]: repo-setup-centos-powertools                    1.6 kB/s | 4.3 kB     00:02
Jan 12 13:28:12 compute-0 dnf[34139]: Extra Packages for Enterprise Linux 9 - x86_64   22 kB/s |  32 kB     00:01
Jan 12 13:28:13 compute-0 dnf[34139]: Metadata cache created.
Jan 12 13:28:13 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 12 13:28:13 compute-0 systemd[1]: Finished dnf makecache.
Jan 12 13:28:13 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.356s CPU time.
Jan 12 13:28:43 compute-0 kernel: SELinux:  Converting 2717 SID table entries...
Jan 12 13:28:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:28:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:28:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:28:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:28:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:28:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:28:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:28:43 compute-0 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 12 13:28:43 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:28:43 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:28:43 compute-0 systemd[1]: Reloading.
Jan 12 13:28:43 compute-0 systemd-rc-local-generator[34507]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:28:44 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:28:44 compute-0 sudo[33883]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:44 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:28:44 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:28:44 compute-0 systemd[1]: run-re2160cb9bcb446e6839d09dde12a1367.service: Deactivated successfully.
Jan 12 13:28:44 compute-0 sudo[35424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsbljsxvhnkhpnqqknvsglmjsuspjpwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224524.4542139-160-30253519948174/AnsiballZ_command.py'
Jan 12 13:28:44 compute-0 sudo[35424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:44 compute-0 python3.9[35426]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:28:45 compute-0 sudo[35424]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:46 compute-0 sudo[35705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqzryfqllzzxawbyxstfdxzfasfqwgcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224525.5598075-168-194204652240390/AnsiballZ_selinux.py'
Jan 12 13:28:46 compute-0 sudo[35705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:46 compute-0 python3.9[35707]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 12 13:28:46 compute-0 sudo[35705]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:46 compute-0 sudo[35857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uetatsmzmmpktercfrsjvlqlyddwpsiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224526.486743-179-193051706911541/AnsiballZ_command.py'
Jan 12 13:28:46 compute-0 sudo[35857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:46 compute-0 python3.9[35859]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 12 13:28:47 compute-0 sudo[35857]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:47 compute-0 sudo[36010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdvlzqjujqztsbkizspckwebhvmngquf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224527.5650043-187-26006179018300/AnsiballZ_file.py'
Jan 12 13:28:47 compute-0 sudo[36010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:48 compute-0 python3.9[36012]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:28:48 compute-0 sudo[36010]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:48 compute-0 sudo[36162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuwdnzrgnljyveuwenywfdtvwsatrmvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224528.6678448-195-186215181286822/AnsiballZ_mount.py'
Jan 12 13:28:48 compute-0 sudo[36162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:49 compute-0 python3.9[36164]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 12 13:28:49 compute-0 sudo[36162]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:49 compute-0 sudo[36314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdxkpewcorcvyirighvacyxkjcfqvcjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224529.6746006-223-95402749458693/AnsiballZ_file.py'
Jan 12 13:28:49 compute-0 sudo[36314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:50 compute-0 python3.9[36316]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:28:50 compute-0 sudo[36314]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:50 compute-0 sudo[36466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiuxgjpixynlnhwwvwpeejhlfqucqpxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224530.1404474-231-241359914425489/AnsiballZ_stat.py'
Jan 12 13:28:50 compute-0 sudo[36466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:50 compute-0 python3.9[36468]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:28:50 compute-0 sudo[36466]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:50 compute-0 sudo[36589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqlavdcdvmekduzqaoujbggragbbyzvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224530.1404474-231-241359914425489/AnsiballZ_copy.py'
Jan 12 13:28:50 compute-0 sudo[36589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:50 compute-0 python3.9[36591]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224530.1404474-231-241359914425489/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:28:50 compute-0 sudo[36589]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:51 compute-0 sudo[36741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tubexftglecvxllgqakrytawraebqwky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224531.2094889-255-223745760502667/AnsiballZ_stat.py'
Jan 12 13:28:51 compute-0 sudo[36741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:51 compute-0 python3.9[36743]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:28:51 compute-0 sudo[36741]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:51 compute-0 sudo[36893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmebrysnyzkduylnoqyaueihsaowpmtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224531.6631455-263-71338429172650/AnsiballZ_command.py'
Jan 12 13:28:51 compute-0 sudo[36893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:51 compute-0 python3.9[36895]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:28:52 compute-0 sudo[36893]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:52 compute-0 sudo[37046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feptuhwqyadqmcpcqdephydxivtutqxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224532.126434-271-82497661692647/AnsiballZ_file.py'
Jan 12 13:28:52 compute-0 sudo[37046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:52 compute-0 python3.9[37048]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:28:52 compute-0 sudo[37046]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:53 compute-0 sudo[37198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mepeueycdflwumfrgfeaglnjcpkfttty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224532.8021202-282-183269810855360/AnsiballZ_getent.py'
Jan 12 13:28:53 compute-0 sudo[37198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:55 compute-0 python3.9[37200]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 12 13:28:55 compute-0 sudo[37198]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:56 compute-0 sudo[37351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzzpyhotceysbzlvcnmcwczfgyemzlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224535.9801314-290-39871756858275/AnsiballZ_group.py'
Jan 12 13:28:56 compute-0 sudo[37351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:56 compute-0 python3.9[37353]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 12 13:28:56 compute-0 groupadd[37354]: group added to /etc/group: name=qemu, GID=107
Jan 12 13:28:56 compute-0 groupadd[37354]: group added to /etc/gshadow: name=qemu
Jan 12 13:28:56 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 12 13:28:56 compute-0 groupadd[37354]: new group: name=qemu, GID=107
Jan 12 13:28:56 compute-0 sudo[37351]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:57 compute-0 sudo[37510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ougaaytjateqxiqqcsjainvyhsvhlsbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224536.6381934-298-277681338917033/AnsiballZ_user.py'
Jan 12 13:28:57 compute-0 sudo[37510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:57 compute-0 python3.9[37512]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 12 13:28:57 compute-0 useradd[37514]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 12 13:28:57 compute-0 sudo[37510]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:57 compute-0 sudo[37670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpdvjrvzhjbhtezboaavocjcwpcwvhyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224537.3790817-306-131356305297360/AnsiballZ_getent.py'
Jan 12 13:28:57 compute-0 sudo[37670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:57 compute-0 python3.9[37672]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 12 13:28:57 compute-0 sudo[37670]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:58 compute-0 sudo[37823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqkyxzepixwtlopcsjyuygzhwokckgas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224537.861669-314-2350047799724/AnsiballZ_group.py'
Jan 12 13:28:58 compute-0 sudo[37823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:58 compute-0 python3.9[37825]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 12 13:28:58 compute-0 groupadd[37826]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 12 13:28:58 compute-0 groupadd[37826]: group added to /etc/gshadow: name=hugetlbfs
Jan 12 13:28:58 compute-0 groupadd[37826]: new group: name=hugetlbfs, GID=42477
Jan 12 13:28:58 compute-0 sudo[37823]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:58 compute-0 sudo[37981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pokeenqymptdrehctmbdysnqjmiiycld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224538.4116018-323-234823622086077/AnsiballZ_file.py'
Jan 12 13:28:58 compute-0 sudo[37981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:58 compute-0 python3.9[37983]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 12 13:28:58 compute-0 sudo[37981]: pam_unix(sudo:session): session closed for user root
Jan 12 13:28:59 compute-0 sudo[38133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfmgpuigwcsxrbdhsvkkfrehfnesrqfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224539.0110648-334-178638566429416/AnsiballZ_dnf.py'
Jan 12 13:28:59 compute-0 sudo[38133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:28:59 compute-0 python3.9[38135]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:29:00 compute-0 sudo[38133]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:01 compute-0 sudo[38286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytzhdeyiawjvrhzvseowoigsofxjgxzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224540.8323817-342-148170725035842/AnsiballZ_file.py'
Jan 12 13:29:01 compute-0 sudo[38286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:01 compute-0 python3.9[38288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:01 compute-0 sudo[38286]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:01 compute-0 sudo[38438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckwudgnbgorwkapwjtzyrlavtqhztprh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224541.4062154-350-160788738518361/AnsiballZ_stat.py'
Jan 12 13:29:01 compute-0 sudo[38438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:01 compute-0 python3.9[38440]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:29:01 compute-0 sudo[38438]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:01 compute-0 sudo[38561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvlrdybbqfilhalnwurqbmzkyxwxwmnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224541.4062154-350-160788738518361/AnsiballZ_copy.py'
Jan 12 13:29:01 compute-0 sudo[38561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:02 compute-0 python3.9[38563]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224541.4062154-350-160788738518361/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:02 compute-0 sudo[38561]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:02 compute-0 sudo[38713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tckxputgqblmsducciyludgikeuszwll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224542.2427177-365-145036931333649/AnsiballZ_systemd.py'
Jan 12 13:29:02 compute-0 sudo[38713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:02 compute-0 python3.9[38715]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:29:02 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 12 13:29:03 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 12 13:29:03 compute-0 systemd-modules-load[38719]: Inserted module 'br_netfilter'
Jan 12 13:29:03 compute-0 kernel: Bridge firewalling registered
Jan 12 13:29:03 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 12 13:29:03 compute-0 sudo[38713]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:03 compute-0 sudo[38872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctbbjklyekswccdfydiweypblrfbcuql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224543.1773548-373-39172037444083/AnsiballZ_stat.py'
Jan 12 13:29:03 compute-0 sudo[38872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:03 compute-0 python3.9[38874]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:29:03 compute-0 sudo[38872]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:03 compute-0 sudo[38995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzkiotjfipjabcpnkazxwurjzktrqfex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224543.1773548-373-39172037444083/AnsiballZ_copy.py'
Jan 12 13:29:03 compute-0 sudo[38995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:03 compute-0 python3.9[38997]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224543.1773548-373-39172037444083/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:03 compute-0 sudo[38995]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:04 compute-0 sudo[39147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udmxoqavgpskhhgykwfzzcrxwermxemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224544.1925077-391-92169910724179/AnsiballZ_dnf.py'
Jan 12 13:29:04 compute-0 sudo[39147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:04 compute-0 python3.9[39149]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:29:09 compute-0 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Jan 12 13:29:09 compute-0 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Jan 12 13:29:09 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:29:09 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:29:09 compute-0 systemd[1]: Reloading.
Jan 12 13:29:09 compute-0 systemd-rc-local-generator[39207]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:29:09 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:29:09 compute-0 sudo[39147]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:10 compute-0 python3.9[40199]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:29:11 compute-0 python3.9[41025]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 12 13:29:11 compute-0 python3.9[41701]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:29:12 compute-0 sudo[42455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvaeaosanezsgqwonkrqkmjrhhzuvzjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224551.7821875-430-252254314507217/AnsiballZ_command.py'
Jan 12 13:29:12 compute-0 sudo[42455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:12 compute-0 python3.9[42478]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:29:12 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 12 13:29:12 compute-0 systemd[1]: Starting Authorization Manager...
Jan 12 13:29:12 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 12 13:29:12 compute-0 polkitd[43250]: Started polkitd version 0.117
Jan 12 13:29:12 compute-0 polkitd[43250]: Loading rules from directory /etc/polkit-1/rules.d
Jan 12 13:29:12 compute-0 polkitd[43250]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 12 13:29:12 compute-0 polkitd[43250]: Finished loading, compiling and executing 2 rules
Jan 12 13:29:12 compute-0 systemd[1]: Started Authorization Manager.
Jan 12 13:29:12 compute-0 polkitd[43250]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 12 13:29:12 compute-0 sudo[42455]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:12 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:29:12 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:29:12 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.503s CPU time.
Jan 12 13:29:12 compute-0 systemd[1]: run-r3303f372d7d04124b608f884fe977a2d.service: Deactivated successfully.
Jan 12 13:29:13 compute-0 sudo[43691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhvlcrwleuepjtcjnqugmxlmgdoregfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224552.8790026-439-209662628983648/AnsiballZ_systemd.py'
Jan 12 13:29:13 compute-0 sudo[43691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:13 compute-0 python3.9[43693]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:29:13 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 12 13:29:13 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 12 13:29:13 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 12 13:29:13 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 12 13:29:13 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 12 13:29:13 compute-0 sudo[43691]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:14 compute-0 python3.9[43854]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 12 13:29:15 compute-0 sudo[44004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhytpkjoorsdxicocseetdsfxrhrviyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224555.143701-496-13677737489374/AnsiballZ_systemd.py'
Jan 12 13:29:15 compute-0 sudo[44004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:15 compute-0 python3.9[44006]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:29:15 compute-0 systemd[1]: Reloading.
Jan 12 13:29:15 compute-0 systemd-rc-local-generator[44029]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:29:15 compute-0 sudo[44004]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:16 compute-0 sudo[44193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvbvacqhfjbvgxhayaevsshepidwbczl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224555.8853052-496-190818495014216/AnsiballZ_systemd.py'
Jan 12 13:29:16 compute-0 sudo[44193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:16 compute-0 python3.9[44195]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:29:16 compute-0 systemd[1]: Reloading.
Jan 12 13:29:16 compute-0 systemd-rc-local-generator[44222]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:29:16 compute-0 sudo[44193]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:16 compute-0 sudo[44382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgfbkkgvnhbzqijsouywazchwpodkvia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224556.7039652-512-255204865863106/AnsiballZ_command.py'
Jan 12 13:29:16 compute-0 sudo[44382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:17 compute-0 python3.9[44384]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:29:17 compute-0 sudo[44382]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:17 compute-0 sudo[44535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjeifbszwzpoeeldjwdgaxnexgeyffxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224557.1645677-520-154854096762423/AnsiballZ_command.py'
Jan 12 13:29:17 compute-0 sudo[44535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:17 compute-0 python3.9[44537]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:29:17 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 12 13:29:17 compute-0 sudo[44535]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:17 compute-0 sudo[44688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okkbozcvyhqmedocelmxuekxizslmswl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224557.6698227-528-278377973088321/AnsiballZ_command.py'
Jan 12 13:29:17 compute-0 sudo[44688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:18 compute-0 python3.9[44690]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:29:19 compute-0 sudo[44688]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:19 compute-0 sudo[44850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otddhquvszngjpoyzqijovqttnbartgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224559.3082645-536-52560075937047/AnsiballZ_command.py'
Jan 12 13:29:19 compute-0 sudo[44850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:19 compute-0 python3.9[44852]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:29:19 compute-0 sudo[44850]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:19 compute-0 sudo[45003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtoitcylywsvnotjbgxpfbngscktkrtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224559.7842486-544-92466362193988/AnsiballZ_systemd.py'
Jan 12 13:29:19 compute-0 sudo[45003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:20 compute-0 python3.9[45005]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:29:20 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 12 13:29:20 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 12 13:29:20 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 12 13:29:20 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 12 13:29:20 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 12 13:29:20 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 12 13:29:20 compute-0 sudo[45003]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:20 compute-0 sshd-session[31420]: Connection closed by 192.168.122.30 port 36284
Jan 12 13:29:20 compute-0 sshd-session[31417]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:29:20 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 12 13:29:20 compute-0 systemd[1]: session-9.scope: Consumed 1min 46.629s CPU time.
Jan 12 13:29:20 compute-0 systemd-logind[775]: Session 9 logged out. Waiting for processes to exit.
Jan 12 13:29:20 compute-0 systemd-logind[775]: Removed session 9.
Jan 12 13:29:25 compute-0 sshd-session[45036]: Accepted publickey for zuul from 192.168.122.30 port 45122 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:29:25 compute-0 systemd-logind[775]: New session 10 of user zuul.
Jan 12 13:29:25 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 12 13:29:25 compute-0 sshd-session[45036]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:29:25 compute-0 python3.9[45189]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:29:26 compute-0 python3.9[45343]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:29:27 compute-0 sudo[45497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpnlvxkxuwtrpncxmyqdadespfaytcob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224567.2357793-45-51556899029100/AnsiballZ_command.py'
Jan 12 13:29:27 compute-0 sudo[45497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:27 compute-0 python3.9[45499]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:29:27 compute-0 sudo[45497]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:28 compute-0 python3.9[45650]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:29:28 compute-0 sudo[45804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddwhgyhdtwhajvcqeejjomjksviskohz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224568.7333062-65-161707117772529/AnsiballZ_setup.py'
Jan 12 13:29:28 compute-0 sudo[45804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:29 compute-0 python3.9[45806]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:29:29 compute-0 sudo[45804]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:29 compute-0 sudo[45888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdzexbqgfnbrhszukmccehbmkiiemowx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224568.7333062-65-161707117772529/AnsiballZ_dnf.py'
Jan 12 13:29:29 compute-0 sudo[45888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:29 compute-0 python3.9[45890]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:29:30 compute-0 sudo[45888]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:31 compute-0 sudo[46041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibvcsricsvxzenezdwjaaudylqyhymwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224571.0371685-77-192348007732876/AnsiballZ_setup.py'
Jan 12 13:29:31 compute-0 sudo[46041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:31 compute-0 python3.9[46043]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:29:31 compute-0 sudo[46041]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:32 compute-0 sudo[46212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytfovmgdjqllcqnsilsgdotebjvkaznx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224571.8136208-88-225431774409079/AnsiballZ_file.py'
Jan 12 13:29:32 compute-0 sudo[46212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:32 compute-0 python3.9[46214]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:29:32 compute-0 sudo[46212]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:32 compute-0 sudo[46364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdddslzugjzqdfvsumzfiznxambzogsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224572.447485-96-47686578316674/AnsiballZ_command.py'
Jan 12 13:29:32 compute-0 sudo[46364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:32 compute-0 python3.9[46366]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:29:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1550149366-merged.mount: Deactivated successfully.
Jan 12 13:29:32 compute-0 podman[46367]: 2026-01-12 13:29:32.862894419 +0000 UTC m=+0.039900683 system refresh
Jan 12 13:29:32 compute-0 sudo[46364]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:33 compute-0 sudo[46526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okfmgfteedaludfslroambtnjwkrtbst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224573.023144-104-205970179761394/AnsiballZ_stat.py'
Jan 12 13:29:33 compute-0 sudo[46526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:33 compute-0 python3.9[46528]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:29:33 compute-0 sudo[46526]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:29:33 compute-0 sudo[46649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohxjjzntwjaussjjypityckxcdqmtahk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224573.023144-104-205970179761394/AnsiballZ_copy.py'
Jan 12 13:29:33 compute-0 sudo[46649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:34 compute-0 python3.9[46651]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224573.023144-104-205970179761394/.source.json follow=False _original_basename=podman_network_config.j2 checksum=45520895fbb3dcb52b340a07c332cb9efa12aeb0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:29:34 compute-0 sudo[46649]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:34 compute-0 sudo[46801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psqtrcbbuksumrorxxysobtlnkbwhydv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224574.1878557-119-11880791160479/AnsiballZ_stat.py'
Jan 12 13:29:34 compute-0 sudo[46801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:34 compute-0 python3.9[46803]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:29:34 compute-0 sudo[46801]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:34 compute-0 sudo[46924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvkahksvovsycoedrdecujxaknbevqey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224574.1878557-119-11880791160479/AnsiballZ_copy.py'
Jan 12 13:29:34 compute-0 sudo[46924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:34 compute-0 python3.9[46926]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224574.1878557-119-11880791160479/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:34 compute-0 sudo[46924]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:35 compute-0 sudo[47076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cudtycfgvhuthoumkumbucildnbdjxug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224575.135443-135-44758958971749/AnsiballZ_ini_file.py'
Jan 12 13:29:35 compute-0 sudo[47076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:35 compute-0 python3.9[47078]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:35 compute-0 sudo[47076]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:35 compute-0 sudo[47228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpqttuetxigohwzuccjbqyrhtjuxgzvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224575.7365313-135-251919469075161/AnsiballZ_ini_file.py'
Jan 12 13:29:35 compute-0 sudo[47228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:36 compute-0 python3.9[47230]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:36 compute-0 sudo[47228]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:36 compute-0 sudo[47380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkryzomndjnhuqsmdwlmrpthophvtcpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224576.1984618-135-260473897619264/AnsiballZ_ini_file.py'
Jan 12 13:29:36 compute-0 sudo[47380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:36 compute-0 python3.9[47382]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:36 compute-0 sudo[47380]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:36 compute-0 sudo[47532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfxjlnlewnhheypdozpkdilisdgkskcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224576.6495886-135-2253237931455/AnsiballZ_ini_file.py'
Jan 12 13:29:36 compute-0 sudo[47532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:36 compute-0 python3.9[47534]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:29:36 compute-0 sudo[47532]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:37 compute-0 python3.9[47684]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:29:37 compute-0 sudo[47836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uesbmrheqdyolqjkzsdiljpwaywtmllt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224577.8143551-175-28010460526942/AnsiballZ_dnf.py'
Jan 12 13:29:37 compute-0 sudo[47836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:38 compute-0 python3.9[47838]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:29:39 compute-0 sudo[47836]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:39 compute-0 sudo[47989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzvjiyuekozwzzdpxegohjmvrcgomtki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224579.2765582-183-42713679904939/AnsiballZ_dnf.py'
Jan 12 13:29:39 compute-0 sudo[47989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:39 compute-0 python3.9[47991]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:29:42 compute-0 sudo[47989]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:43 compute-0 sudo[48151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slbrbbsehsadbuqywmqhekbspuehizgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224583.0767474-193-20820309496695/AnsiballZ_dnf.py'
Jan 12 13:29:43 compute-0 sudo[48151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:43 compute-0 python3.9[48153]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:29:44 compute-0 sudo[48151]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:44 compute-0 sudo[48304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfntzeuobxpkrzwqxjgzjytshweycdav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224584.5762653-202-45772350437919/AnsiballZ_dnf.py'
Jan 12 13:29:44 compute-0 sudo[48304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:44 compute-0 python3.9[48306]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:29:45 compute-0 sudo[48304]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:46 compute-0 sudo[48457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igcvtcvtinykpgttmxvxfqgqfknorecd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224586.1382775-213-216178809192455/AnsiballZ_dnf.py'
Jan 12 13:29:46 compute-0 sudo[48457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:46 compute-0 python3.9[48459]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:29:48 compute-0 sudo[48457]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:48 compute-0 sudo[48613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gezyytoscybbdculcfdgrynsnyuvliqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224588.2166977-221-52777181367797/AnsiballZ_dnf.py'
Jan 12 13:29:48 compute-0 sudo[48613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:48 compute-0 python3.9[48615]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:29:52 compute-0 sudo[48613]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:53 compute-0 sudo[48783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldthhmttgoxaaxcmskjlbfolgndcsbkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224592.9120936-230-236057066605046/AnsiballZ_dnf.py'
Jan 12 13:29:53 compute-0 sudo[48783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:53 compute-0 python3.9[48785]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:29:54 compute-0 sudo[48783]: pam_unix(sudo:session): session closed for user root
Jan 12 13:29:54 compute-0 sudo[48936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngfzrexdopocqhtdusiouapmcnyvxkvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224594.441284-239-99321554389020/AnsiballZ_dnf.py'
Jan 12 13:29:54 compute-0 sudo[48936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:29:54 compute-0 python3.9[48938]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:30:10 compute-0 sudo[48936]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:10 compute-0 sudo[49275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkrvubooozisphghmjlsoaoccqbzrhim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224610.7334416-248-50528654627405/AnsiballZ_dnf.py'
Jan 12 13:30:10 compute-0 sudo[49275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:11 compute-0 python3.9[49277]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:30:12 compute-0 sudo[49275]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:12 compute-0 sudo[49431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imylxcfkcsbjwfzqqnhimohifeowthvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224612.3132596-258-151856227615701/AnsiballZ_dnf.py'
Jan 12 13:30:12 compute-0 sudo[49431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:12 compute-0 python3.9[49433]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:30:15 compute-0 sudo[49431]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:15 compute-0 sudo[49588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqbdfqoqfwipyeadrzuwduwwtqinuime ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224615.6122236-269-49637515797737/AnsiballZ_file.py'
Jan 12 13:30:15 compute-0 sudo[49588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:15 compute-0 python3.9[49590]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:30:15 compute-0 sudo[49588]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:16 compute-0 sudo[49763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gozhfodqvmgoertxmskqjsmbepeweixs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224616.0766494-277-24307965069209/AnsiballZ_stat.py'
Jan 12 13:30:16 compute-0 sudo[49763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:16 compute-0 python3.9[49765]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:30:16 compute-0 sudo[49763]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:16 compute-0 sudo[49886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrqfehojbooikylygqcbbqvjeijxwfzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224616.0766494-277-24307965069209/AnsiballZ_copy.py'
Jan 12 13:30:16 compute-0 sudo[49886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:16 compute-0 python3.9[49888]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1768224616.0766494-277-24307965069209/.source.json _original_basename=.p8iritms follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:30:16 compute-0 sudo[49886]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:17 compute-0 sudo[50038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwidrhjsinnnyfkixdlmlqjffbkkmqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224617.0326548-295-19950113709579/AnsiballZ_podman_image.py'
Jan 12 13:30:17 compute-0 sudo[50038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:17 compute-0 python3.9[50040]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 12 13:30:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1824680241-merged.mount: Deactivated successfully.
Jan 12 13:30:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1824680241-lower\x2dmapped.mount: Deactivated successfully.
Jan 12 13:30:23 compute-0 podman[50051]: 2026-01-12 13:30:23.824910721 +0000 UTC m=+6.225351553 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 12 13:30:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:24 compute-0 sudo[50038]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:24 compute-0 sudo[50317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faenhazvurnhlcgrvemqmdertxrtnizt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224624.2655184-306-227064318789897/AnsiballZ_podman_image.py'
Jan 12 13:30:24 compute-0 sudo[50317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:24 compute-0 python3.9[50319]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 12 13:30:35 compute-0 podman[50329]: 2026-01-12 13:30:35.031258302 +0000 UTC m=+10.337856910 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:30:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:35 compute-0 sudo[50317]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:35 compute-0 sudo[50592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpreknbwnvwxgvymserrwejeajydzoot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224635.4407241-316-116931279894241/AnsiballZ_podman_image.py'
Jan 12 13:30:35 compute-0 sudo[50592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:35 compute-0 python3.9[50594]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 12 13:30:48 compute-0 podman[50604]: 2026-01-12 13:30:48.871110567 +0000 UTC m=+12.982460051 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 12 13:30:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:49 compute-0 sudo[50592]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:49 compute-0 sudo[50836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuxnanzumklcswfcorhimzbfarkjeini ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224649.3247008-327-139668109491362/AnsiballZ_podman_image.py'
Jan 12 13:30:49 compute-0 sudo[50836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:49 compute-0 python3.9[50838]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 12 13:30:54 compute-0 podman[50848]: 2026-01-12 13:30:54.786787384 +0000 UTC m=+5.052700910 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf
Jan 12 13:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:54 compute-0 sudo[50836]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:55 compute-0 sudo[51080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpcrhyvdmxaswxrdfpcvwkyszopaxzhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224655.083542-327-35859149450870/AnsiballZ_podman_image.py'
Jan 12 13:30:55 compute-0 sudo[51080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:30:55 compute-0 python3.9[51082]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 12 13:30:57 compute-0 podman[51092]: 2026-01-12 13:30:57.603741898 +0000 UTC m=+2.088695033 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Jan 12 13:30:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:30:57 compute-0 sudo[51080]: pam_unix(sudo:session): session closed for user root
Jan 12 13:30:58 compute-0 sshd-session[45039]: Connection closed by 192.168.122.30 port 45122
Jan 12 13:30:58 compute-0 sshd-session[45036]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:30:58 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 12 13:30:58 compute-0 systemd[1]: session-10.scope: Consumed 1min 31.793s CPU time.
Jan 12 13:30:58 compute-0 systemd-logind[775]: Session 10 logged out. Waiting for processes to exit.
Jan 12 13:30:58 compute-0 systemd-logind[775]: Removed session 10.
Jan 12 13:31:03 compute-0 sshd-session[51212]: Accepted publickey for zuul from 192.168.122.30 port 40872 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:31:03 compute-0 systemd-logind[775]: New session 11 of user zuul.
Jan 12 13:31:03 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 12 13:31:03 compute-0 sshd-session[51212]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:31:04 compute-0 python3.9[51365]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:31:05 compute-0 sudo[51519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fijnwcdbcpyrvaqxdqeoplqggjnktxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224664.8259745-31-112474166596175/AnsiballZ_getent.py'
Jan 12 13:31:05 compute-0 sudo[51519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:05 compute-0 python3.9[51521]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 12 13:31:05 compute-0 sudo[51519]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:05 compute-0 sudo[51672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crguigcscrvmkouiebdyjhcipmbushoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224665.4458776-39-259579518765614/AnsiballZ_group.py'
Jan 12 13:31:05 compute-0 sudo[51672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:05 compute-0 python3.9[51674]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 12 13:31:05 compute-0 groupadd[51675]: group added to /etc/group: name=openvswitch, GID=42476
Jan 12 13:31:05 compute-0 groupadd[51675]: group added to /etc/gshadow: name=openvswitch
Jan 12 13:31:05 compute-0 groupadd[51675]: new group: name=openvswitch, GID=42476
Jan 12 13:31:05 compute-0 sudo[51672]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:06 compute-0 sudo[51830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfxawvgdkckxufesxhqwunhdfumfusjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224666.1255782-47-5970754399935/AnsiballZ_user.py'
Jan 12 13:31:06 compute-0 sudo[51830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:06 compute-0 python3.9[51832]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 12 13:31:06 compute-0 useradd[51834]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 12 13:31:06 compute-0 useradd[51834]: add 'openvswitch' to group 'hugetlbfs'
Jan 12 13:31:06 compute-0 useradd[51834]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 12 13:31:06 compute-0 sudo[51830]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:07 compute-0 sudo[51990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yemcculvidacqdipozmoquiajqnfbzuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224666.9296708-57-19018843390655/AnsiballZ_setup.py'
Jan 12 13:31:07 compute-0 sudo[51990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:07 compute-0 python3.9[51992]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:31:07 compute-0 sudo[51990]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:07 compute-0 sudo[52074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlophlpqntjkqtihxposdqwtsshngcxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224666.9296708-57-19018843390655/AnsiballZ_dnf.py'
Jan 12 13:31:07 compute-0 sudo[52074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:08 compute-0 python3.9[52076]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:31:11 compute-0 sudo[52074]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:11 compute-0 sudo[52236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsgbrwhpxxpgpkotrqbcdqokjvhazvxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224671.5492098-71-126420987506056/AnsiballZ_dnf.py'
Jan 12 13:31:11 compute-0 sudo[52236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:11 compute-0 python3.9[52238]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:31:20 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Jan 12 13:31:20 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:31:20 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:31:20 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:31:20 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:31:20 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:31:20 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:31:20 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:31:20 compute-0 groupadd[52261]: group added to /etc/group: name=unbound, GID=994
Jan 12 13:31:20 compute-0 groupadd[52261]: group added to /etc/gshadow: name=unbound
Jan 12 13:31:20 compute-0 groupadd[52261]: new group: name=unbound, GID=994
Jan 12 13:31:20 compute-0 useradd[52268]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 12 13:31:20 compute-0 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 12 13:31:20 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 12 13:31:21 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:31:21 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:31:21 compute-0 systemd[1]: Reloading.
Jan 12 13:31:21 compute-0 systemd-sysv-generator[52762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:31:21 compute-0 systemd-rc-local-generator[52759]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:31:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:31:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:31:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:31:21 compute-0 systemd[1]: run-rf08af1d032f4446783f69f3929106dc2.service: Deactivated successfully.
Jan 12 13:31:21 compute-0 sudo[52236]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:22 compute-0 sudo[53334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyfeawtmpnszlkjirntzrlxfphtapciu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224682.056999-79-247039695624531/AnsiballZ_systemd.py'
Jan 12 13:31:22 compute-0 sudo[53334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:22 compute-0 python3.9[53336]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:31:22 compute-0 systemd[1]: Reloading.
Jan 12 13:31:22 compute-0 systemd-rc-local-generator[53357]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:31:22 compute-0 systemd-sysv-generator[53362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:31:22 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 12 13:31:22 compute-0 chown[53378]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 12 13:31:23 compute-0 ovs-ctl[53383]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 12 13:31:23 compute-0 ovs-ctl[53383]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 12 13:31:23 compute-0 ovs-ctl[53383]: Starting ovsdb-server [  OK  ]
Jan 12 13:31:23 compute-0 ovs-vsctl[53433]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 12 13:31:23 compute-0 ovs-vsctl[53452]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"9c2d4250-79a9-4504-9090-d7395fcb2080\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 12 13:31:23 compute-0 ovs-ctl[53383]: Configuring Open vSwitch system IDs [  OK  ]
Jan 12 13:31:23 compute-0 ovs-ctl[53383]: Enabling remote OVSDB managers [  OK  ]
Jan 12 13:31:23 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 12 13:31:23 compute-0 ovs-vsctl[53458]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 12 13:31:23 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 12 13:31:23 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 12 13:31:23 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 12 13:31:23 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 12 13:31:23 compute-0 ovs-ctl[53502]: Inserting openvswitch module [  OK  ]
Jan 12 13:31:23 compute-0 ovs-ctl[53471]: Starting ovs-vswitchd [  OK  ]
Jan 12 13:31:23 compute-0 ovs-ctl[53471]: Enabling remote OVSDB managers [  OK  ]
Jan 12 13:31:23 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 12 13:31:23 compute-0 ovs-vsctl[53521]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 12 13:31:23 compute-0 systemd[1]: Starting Open vSwitch...
Jan 12 13:31:23 compute-0 systemd[1]: Finished Open vSwitch.
Jan 12 13:31:23 compute-0 sudo[53334]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:23 compute-0 python3.9[53672]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:31:24 compute-0 sudo[53822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftnnnzzsypgjlkmblauwjdvkszxifohn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224684.089136-97-161101675501135/AnsiballZ_sefcontext.py'
Jan 12 13:31:24 compute-0 sudo[53822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:24 compute-0 python3.9[53824]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 12 13:31:25 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Jan 12 13:31:25 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:31:25 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:31:25 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:31:25 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:31:25 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:31:25 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:31:25 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:31:25 compute-0 sudo[53822]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:26 compute-0 python3.9[53979]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:31:26 compute-0 sudo[54135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfvmqmgfhrsttaspaktsibqnucippvok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224686.3739932-115-45506157512201/AnsiballZ_dnf.py'
Jan 12 13:31:26 compute-0 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 12 13:31:26 compute-0 sudo[54135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:26 compute-0 python3.9[54137]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:31:27 compute-0 sudo[54135]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:28 compute-0 sudo[54288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmhqtroaxwzkdntjdkxrwcmmesztndmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224687.7813323-123-134001976906747/AnsiballZ_command.py'
Jan 12 13:31:28 compute-0 sudo[54288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:28 compute-0 python3.9[54290]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:31:28 compute-0 sudo[54288]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:29 compute-0 sudo[54575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpxuwbtjlvlcnlcpjqsgynnsbifaezva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224688.8566597-131-104345837548651/AnsiballZ_file.py'
Jan 12 13:31:29 compute-0 sudo[54575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:29 compute-0 python3.9[54577]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 12 13:31:29 compute-0 sudo[54575]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:29 compute-0 python3.9[54727]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:31:30 compute-0 sudo[54879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grzvgthieswjnmmlizhbowtrdsbngaje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224689.9914844-147-32883801813318/AnsiballZ_dnf.py'
Jan 12 13:31:30 compute-0 sudo[54879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:30 compute-0 python3.9[54881]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:31:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:31:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:31:33 compute-0 systemd[1]: Reloading.
Jan 12 13:31:33 compute-0 systemd-rc-local-generator[54915]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:31:33 compute-0 systemd-sysv-generator[54918]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:31:33 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:31:33 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:31:33 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:31:33 compute-0 systemd[1]: run-rd6a89039e0c54c57b83a219e64882701.service: Deactivated successfully.
Jan 12 13:31:33 compute-0 sudo[54879]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:34 compute-0 sudo[55197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynlglimnycspnwxljdqbvhzifjddqovh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224693.9643514-155-78306262108000/AnsiballZ_systemd.py'
Jan 12 13:31:34 compute-0 sudo[55197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:34 compute-0 python3.9[55199]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:31:34 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 12 13:31:34 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 12 13:31:34 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 12 13:31:34 compute-0 systemd[1]: Stopping Network Manager...
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4437] caught SIGTERM, shutting down normally.
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4449] dhcp4 (eth0): canceled DHCP transaction
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4450] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4450] dhcp4 (eth0): state changed no lease
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4451] dhcp6 (eth0): canceled DHCP transaction
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4451] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4451] dhcp6 (eth0): state changed no lease
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4452] manager: NetworkManager state is now CONNECTED_SITE
Jan 12 13:31:34 compute-0 NetworkManager[7251]: <info>  [1768224694.4479] exiting (success)
Jan 12 13:31:34 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 12 13:31:34 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 12 13:31:34 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 12 13:31:34 compute-0 systemd[1]: Stopped Network Manager.
Jan 12 13:31:34 compute-0 systemd[1]: Starting Network Manager...
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5102] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:94c17a7f-65c5-449e-af1b-e34d5bf3c7ea)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5104] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5153] manager[0x563d97abb000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 12 13:31:34 compute-0 systemd[1]: Starting Hostname Service...
Jan 12 13:31:34 compute-0 systemd[1]: Started Hostname Service.
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5763] hostname: hostname: using hostnamed
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5765] hostname: static hostname changed from (none) to "compute-0"
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5767] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5770] manager[0x563d97abb000]: rfkill: Wi-Fi hardware radio set enabled
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5770] manager[0x563d97abb000]: rfkill: WWAN hardware radio set enabled
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5786] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-ovs.so)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5793] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5793] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5793] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5794] manager: Networking is enabled by state file
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5796] settings: Loaded settings plugin: keyfile (internal)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5798] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5817] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5824] dhcp: init: Using DHCP client 'internal'
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5826] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5831] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5836] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5841] device (lo): Activation: starting connection 'lo' (32e0de59-2ecd-409f-9e28-312d4eb0815d)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5846] device (eth0): carrier: link connected
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5849] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5852] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5852] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5857] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5862] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5866] device (eth1): carrier: link connected
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5869] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5872] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (cd22dd6a-6b6d-5d12-97bb-b6ad4ed87c99) (indicated)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5872] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5876] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5880] device (eth1): Activation: starting connection 'ci-private-network' (cd22dd6a-6b6d-5d12-97bb-b6ad4ed87c99)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5884] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 12 13:31:34 compute-0 systemd[1]: Started Network Manager.
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5906] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5908] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5909] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5911] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5913] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5914] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5916] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5918] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5922] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5925] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5927] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5932] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5935] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5941] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:31:34 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5963] dhcp4 (eth0): state changed new lease, address=192.168.25.114
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.5979] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.6008] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.6010] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.6011] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.6014] device (lo): Activation: successful, device activated.
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.6018] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.6023] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 12 13:31:34 compute-0 NetworkManager[55211]: <info>  [1768224694.6024] device (eth1): Activation: successful, device activated.
Jan 12 13:31:34 compute-0 sudo[55197]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:34 compute-0 sudo[55406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztnmbsrqpzfqqwdrsuxrohaynsdqnxxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224694.7639925-163-246409041776785/AnsiballZ_dnf.py'
Jan 12 13:31:34 compute-0 sudo[55406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:35 compute-0 python3.9[55408]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6217] dhcp6 (eth0): state changed new lease, address=2001:db8::10a
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6227] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6267] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6269] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6271] manager: NetworkManager state is now CONNECTED_SITE
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6272] device (eth0): Activation: successful, device activated.
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6275] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 12 13:31:35 compute-0 NetworkManager[55211]: <info>  [1768224695.6276] manager: startup complete
Jan 12 13:31:35 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 12 13:31:41 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:31:41 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:31:41 compute-0 systemd[1]: Reloading.
Jan 12 13:31:41 compute-0 systemd-sysv-generator[55477]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:31:41 compute-0 systemd-rc-local-generator[55474]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:31:41 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:31:42 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:31:42 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:31:42 compute-0 systemd[1]: run-re9cd7920d7fe4016894d163b4099dd08.service: Deactivated successfully.
Jan 12 13:31:42 compute-0 sudo[55406]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:42 compute-0 sudo[55885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xndroeiwmuobofgtntjneqqlbcdlhvsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224702.7475731-175-156791003364398/AnsiballZ_stat.py'
Jan 12 13:31:42 compute-0 sudo[55885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:43 compute-0 python3.9[55887]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:31:43 compute-0 sudo[55885]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:43 compute-0 sudo[56037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duqyadvckexylwrhohtwbwcehrawkckw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224703.192866-184-15332344979043/AnsiballZ_ini_file.py'
Jan 12 13:31:43 compute-0 sudo[56037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:43 compute-0 python3.9[56039]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:43 compute-0 sudo[56037]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:44 compute-0 sudo[56191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttewcnojklxmgqbttryuuxvittfwoejw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224703.9057646-194-166595835268340/AnsiballZ_ini_file.py'
Jan 12 13:31:44 compute-0 sudo[56191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:44 compute-0 python3.9[56193]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:44 compute-0 sudo[56191]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:44 compute-0 sudo[56343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzhlughsootilxpmudndydgthxgszuqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224704.343151-194-227014544919364/AnsiballZ_ini_file.py'
Jan 12 13:31:44 compute-0 sudo[56343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:44 compute-0 python3.9[56345]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:44 compute-0 sudo[56343]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:44 compute-0 sudo[56497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukqexedeiijajeddvrngphpztxraqijq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224704.808551-209-31660823608370/AnsiballZ_ini_file.py'
Jan 12 13:31:44 compute-0 sudo[56497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:45 compute-0 python3.9[56499]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:45 compute-0 sudo[56497]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:45 compute-0 sudo[56649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maanqlzwrkwbssdtixgvpjdfsfjmecpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224705.2334416-209-20490340171482/AnsiballZ_ini_file.py'
Jan 12 13:31:45 compute-0 sudo[56649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:45 compute-0 python3.9[56651]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:45 compute-0 sudo[56649]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:45 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 12 13:31:45 compute-0 sudo[56801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubbiuxbmaoodyrcteqqyqsjxxcvwdjcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224705.6938386-224-152484639372351/AnsiballZ_stat.py'
Jan 12 13:31:45 compute-0 sudo[56801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:46 compute-0 python3.9[56803]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:31:46 compute-0 sudo[56801]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:46 compute-0 sudo[56924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbncqixncpczoicfnivwfgcopepsghsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224705.6938386-224-152484639372351/AnsiballZ_copy.py'
Jan 12 13:31:46 compute-0 sudo[56924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:46 compute-0 python3.9[56926]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224705.6938386-224-152484639372351/.source _original_basename=.tksfzxux follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:46 compute-0 sudo[56924]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:46 compute-0 sudo[57076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iflropwsktcoplsyrbexqhwfpxgbzswj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224706.6229045-239-207622091570597/AnsiballZ_file.py'
Jan 12 13:31:46 compute-0 sudo[57076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:46 compute-0 python3.9[57078]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:46 compute-0 sudo[57076]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:47 compute-0 sudo[57228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzujgxeiaboyehjydiwzyunnjaqahced ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224707.0856576-247-153773249626518/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 12 13:31:47 compute-0 sudo[57228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:47 compute-0 python3.9[57230]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 12 13:31:47 compute-0 sudo[57228]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:47 compute-0 sudo[57380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tybwdttyqthkzepcskxkhnalejfcyxuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224707.7018495-256-170883370731701/AnsiballZ_file.py'
Jan 12 13:31:47 compute-0 sudo[57380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:48 compute-0 python3.9[57382]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:48 compute-0 sudo[57380]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:48 compute-0 sudo[57532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugysyxjdvbljulrfgahvxhmuhujppqfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224708.23953-266-223110901716758/AnsiballZ_stat.py'
Jan 12 13:31:48 compute-0 sudo[57532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:48 compute-0 sudo[57532]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:48 compute-0 sudo[57655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfrdwkxadflxcwndxclqaykplmxerrav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224708.23953-266-223110901716758/AnsiballZ_copy.py'
Jan 12 13:31:48 compute-0 sudo[57655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:48 compute-0 sudo[57655]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:49 compute-0 sudo[57807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymltttwzpxgsugyrftyobhdqofylfnyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224709.0514457-281-229664143128133/AnsiballZ_slurp.py'
Jan 12 13:31:49 compute-0 sudo[57807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:49 compute-0 python3.9[57809]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 12 13:31:49 compute-0 sudo[57807]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:50 compute-0 sudo[57982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlxqtuahqruuddqcjngsffxhowuqepbr ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224709.6612294-290-44731197478684/async_wrapper.py j224852308323 300 /home/zuul/.ansible/tmp/ansible-tmp-1768224709.6612294-290-44731197478684/AnsiballZ_edpm_os_net_config.py _'
Jan 12 13:31:50 compute-0 sudo[57982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:50 compute-0 ansible-async_wrapper.py[57984]: Invoked with j224852308323 300 /home/zuul/.ansible/tmp/ansible-tmp-1768224709.6612294-290-44731197478684/AnsiballZ_edpm_os_net_config.py _
Jan 12 13:31:50 compute-0 ansible-async_wrapper.py[57987]: Starting module and watcher
Jan 12 13:31:50 compute-0 ansible-async_wrapper.py[57987]: Start watching 57988 (300)
Jan 12 13:31:50 compute-0 ansible-async_wrapper.py[57988]: Start module (57988)
Jan 12 13:31:50 compute-0 ansible-async_wrapper.py[57984]: Return async_wrapper task started.
Jan 12 13:31:50 compute-0 sudo[57982]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:50 compute-0 python3.9[57989]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 12 13:31:51 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 12 13:31:51 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 12 13:31:51 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 12 13:31:51 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 12 13:31:51 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8119] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8133] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8537] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8539] audit: op="connection-add" uuid="76f3c824-e4b5-4103-9334-3f6bfd42c218" name="br-ex-br" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8550] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8551] audit: op="connection-add" uuid="edb0f0f2-926d-4d67-8e3f-13d6a9af18f0" name="br-ex-port" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8561] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8562] audit: op="connection-add" uuid="533851a8-9ff3-4f22-8615-086bbfda2345" name="eth1-port" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8571] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8572] audit: op="connection-add" uuid="3a364d80-f698-43a4-a081-0b80cdde34c5" name="vlan20-port" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8582] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8583] audit: op="connection-add" uuid="30d3a37c-9bb1-4fe8-91e1-cf7c9b35f06d" name="vlan21-port" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8592] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8593] audit: op="connection-add" uuid="c3c074f4-73c8-46ba-b76a-82cf4b218e39" name="vlan22-port" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8609] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.may-fail,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.routes,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8623] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8624] audit: op="connection-add" uuid="4f331a68-ef25-4cc3-b893-72f660d5a76e" name="br-ex-if" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8644] audit: op="connection-update" uuid="cd22dd6a-6b6d-5d12-97bb-b6ad4ed87c99" name="ci-private-network" args="ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.dns,ipv6.addresses,ipv6.method,ipv6.routes,ipv4.never-default,ipv4.routing-rules,ipv4.dns,ipv4.addresses,ipv4.method,ipv4.routes,ovs-external-ids.data,connection.controller,connection.timestamp,connection.master,connection.slave-type,connection.port-type,ovs-interface.type" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8657] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8658] audit: op="connection-add" uuid="fd94e02e-9184-4c1a-b25b-dc1d536763fc" name="vlan20-if" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8671] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8672] audit: op="connection-add" uuid="73922ef6-38d9-490d-95d0-413297807f85" name="vlan21-if" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8684] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8685] audit: op="connection-add" uuid="35f4b4a1-40b0-4f1f-839b-02fcb2bd5bda" name="vlan22-if" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8694] audit: op="connection-delete" uuid="80b55774-fe11-3aa1-9b8b-d743e1ee863f" name="Wired connection 1" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8703] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8704] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8709] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8712] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (76f3c824-e4b5-4103-9334-3f6bfd42c218)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8712] audit: op="connection-activate" uuid="76f3c824-e4b5-4103-9334-3f6bfd42c218" name="br-ex-br" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8713] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8714] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8718] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8721] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (edb0f0f2-926d-4d67-8e3f-13d6a9af18f0)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8722] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8723] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8727] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8730] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (533851a8-9ff3-4f22-8615-086bbfda2345)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8731] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8732] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8735] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8738] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (3a364d80-f698-43a4-a081-0b80cdde34c5)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8739] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8740] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8743] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8746] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (30d3a37c-9bb1-4fe8-91e1-cf7c9b35f06d)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8747] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8748] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8751] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8754] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (c3c074f4-73c8-46ba-b76a-82cf4b218e39)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8755] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8756] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8758] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8762] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8763] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8765] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8768] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (4f331a68-ef25-4cc3-b893-72f660d5a76e)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8768] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8771] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8772] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8773] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8773] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8782] device (eth1): disconnecting for new activation request.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8782] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8784] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8786] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8786] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8788] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8789] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8792] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8795] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (fd94e02e-9184-4c1a-b25b-dc1d536763fc)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8795] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8797] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8799] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8800] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8802] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8803] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8805] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8808] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (73922ef6-38d9-490d-95d0-413297807f85)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8809] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8811] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8812] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8814] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8816] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <warn>  [1768224711.8817] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8819] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8822] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (35f4b4a1-40b0-4f1f-839b-02fcb2bd5bda)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8823] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8825] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8826] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8827] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8828] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8837] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.may-fail,ipv6.addr-gen-mode,ipv6.method,ipv6.routes,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,connection.autoconnect-priority" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8839] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8841] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8842] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8851] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8854] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8857] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8859] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8860] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8863] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8867] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8871] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8873] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8876] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 kernel: Timeout policy base is empty
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8880] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8882] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8883] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8886] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8890] dhcp4 (eth0): canceled DHCP transaction
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8890] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8890] dhcp4 (eth0): state changed no lease
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8891] dhcp6 (eth0): canceled DHCP transaction
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8891] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8891] dhcp6 (eth0): state changed no lease
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8894] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 12 13:31:51 compute-0 systemd-udevd[57994]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8901] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8906] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57990 uid=0 result="fail" reason="Device is not activated"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8909] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8914] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8919] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8921] dhcp4 (eth0): state changed new lease, address=192.168.25.114
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8962] device (eth1): disconnecting for new activation request.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.8963] audit: op="connection-activate" uuid="cd22dd6a-6b6d-5d12-97bb-b6ad4ed87c99" name="ci-private-network" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9042] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57990 uid=0 result="success"
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9066] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 12 13:31:51 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 12 13:31:51 compute-0 kernel: br-ex: entered promiscuous mode
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9107] device (eth1): Activation: starting connection 'ci-private-network' (cd22dd6a-6b6d-5d12-97bb-b6ad4ed87c99)
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9110] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9122] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9125] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9128] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9131] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9137] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9138] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9142] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9143] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9146] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9151] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9160] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9163] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9165] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9168] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9171] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9173] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9176] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9190] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9202] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9204] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9210] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9212] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9217] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9225] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 kernel: vlan22: entered promiscuous mode
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9289] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9293] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9299] device (eth1): Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9311] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9311] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9317] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9323] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 kernel: vlan21: entered promiscuous mode
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9351] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9398] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9407] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9412] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 kernel: vlan20: entered promiscuous mode
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9428] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9434] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9445] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9450] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9456] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9491] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9500] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9522] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9525] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 12 13:31:51 compute-0 NetworkManager[55211]: <info>  [1768224711.9532] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.0361] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.1276] checkpoint[0x563d97a90950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.1277] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.2222] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.2231] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.3514] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.4412] checkpoint[0x563d97a90a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.4415] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.6466] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.6475] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.7879] audit: op="networking-control" arg="global-dns-configuration" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.7899] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf)
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.7903] audit: op="networking-control" arg="global-dns-configuration" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.7925] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.8923] checkpoint[0x563d97a90af0]: destroy /org/freedesktop/NetworkManager/Checkpoint/3
Jan 12 13:31:53 compute-0 NetworkManager[55211]: <info>  [1768224713.8925] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=57990 uid=0 result="success"
Jan 12 13:31:53 compute-0 sudo[58322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixxwrmugflnxaghrimctsywivwovtwiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224713.5526762-290-84356791425791/AnsiballZ_async_status.py'
Jan 12 13:31:53 compute-0 sudo[58322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:53 compute-0 ansible-async_wrapper.py[57988]: Module complete (57988)
Jan 12 13:31:54 compute-0 python3.9[58324]: ansible-ansible.legacy.async_status Invoked with jid=j224852308323.57984 mode=status _async_dir=/root/.ansible_async
Jan 12 13:31:54 compute-0 sudo[58322]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:54 compute-0 sudo[58421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvopgxkskwqswhoiexrdbrrovvhhzani ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224713.5526762-290-84356791425791/AnsiballZ_async_status.py'
Jan 12 13:31:54 compute-0 sudo[58421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:54 compute-0 python3.9[58423]: ansible-ansible.legacy.async_status Invoked with jid=j224852308323.57984 mode=cleanup _async_dir=/root/.ansible_async
Jan 12 13:31:54 compute-0 sudo[58421]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:54 compute-0 sudo[58574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khcqvizuernflqlzilmlfxpuxjjgwokn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224714.5632014-312-8597061813210/AnsiballZ_stat.py'
Jan 12 13:31:54 compute-0 sudo[58574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:54 compute-0 python3.9[58576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:31:54 compute-0 sudo[58574]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:55 compute-0 sudo[58697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzsjxoaipuotgnclyflofheiqzabmaip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224714.5632014-312-8597061813210/AnsiballZ_copy.py'
Jan 12 13:31:55 compute-0 sudo[58697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:55 compute-0 python3.9[58699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224714.5632014-312-8597061813210/.source.returncode _original_basename=.pagcl7g2 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:55 compute-0 sudo[58697]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:55 compute-0 ansible-async_wrapper.py[57987]: Done in kid B.
Jan 12 13:31:55 compute-0 sudo[58849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asyqtplwwneyodvdoucjepggzjqldrre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224715.5088825-328-216097004758572/AnsiballZ_stat.py'
Jan 12 13:31:55 compute-0 sudo[58849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:55 compute-0 python3.9[58851]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:31:55 compute-0 sudo[58849]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:56 compute-0 sudo[58972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlqdpvkvuvwcryzvpmthtoztmchvezsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224715.5088825-328-216097004758572/AnsiballZ_copy.py'
Jan 12 13:31:56 compute-0 sudo[58972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:56 compute-0 python3.9[58974]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224715.5088825-328-216097004758572/.source.cfg _original_basename=.ywv506i2 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:31:56 compute-0 sudo[58972]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:56 compute-0 sudo[59124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnhrmrlyzrpknrgozffayglegnrgufmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224716.368358-343-87254892326370/AnsiballZ_systemd.py'
Jan 12 13:31:56 compute-0 sudo[59124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:31:56 compute-0 python3.9[59126]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:31:56 compute-0 systemd[1]: Reloading Network Manager...
Jan 12 13:31:56 compute-0 NetworkManager[55211]: <info>  [1768224716.8353] audit: op="reload" arg="0" pid=59130 uid=0 result="success"
Jan 12 13:31:56 compute-0 NetworkManager[55211]: <info>  [1768224716.8359] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 12 13:31:56 compute-0 NetworkManager[55211]: <info>  [1768224716.8360] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 12 13:31:56 compute-0 systemd[1]: Reloaded Network Manager.
Jan 12 13:31:56 compute-0 sudo[59124]: pam_unix(sudo:session): session closed for user root
Jan 12 13:31:57 compute-0 sshd-session[51215]: Connection closed by 192.168.122.30 port 40872
Jan 12 13:31:57 compute-0 sshd-session[51212]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:31:57 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 12 13:31:57 compute-0 systemd[1]: session-11.scope: Consumed 35.528s CPU time.
Jan 12 13:31:57 compute-0 systemd-logind[775]: Session 11 logged out. Waiting for processes to exit.
Jan 12 13:31:57 compute-0 systemd-logind[775]: Removed session 11.
Jan 12 13:32:03 compute-0 sshd-session[59161]: Accepted publickey for zuul from 192.168.122.30 port 41262 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:32:03 compute-0 systemd-logind[775]: New session 12 of user zuul.
Jan 12 13:32:03 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 12 13:32:03 compute-0 sshd-session[59161]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:32:03 compute-0 python3.9[59314]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:32:04 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 12 13:32:04 compute-0 python3.9[59469]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:32:05 compute-0 python3.9[59660]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:32:05 compute-0 sshd-session[59164]: Connection closed by 192.168.122.30 port 41262
Jan 12 13:32:05 compute-0 sshd-session[59161]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:32:05 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 12 13:32:05 compute-0 systemd[1]: session-12.scope: Consumed 1.577s CPU time.
Jan 12 13:32:05 compute-0 systemd-logind[775]: Session 12 logged out. Waiting for processes to exit.
Jan 12 13:32:05 compute-0 systemd-logind[775]: Removed session 12.
Jan 12 13:32:06 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 12 13:32:10 compute-0 sshd-session[59688]: Accepted publickey for zuul from 192.168.122.30 port 41268 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:32:10 compute-0 systemd-logind[775]: New session 13 of user zuul.
Jan 12 13:32:10 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 12 13:32:10 compute-0 sshd-session[59688]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:32:11 compute-0 python3.9[59841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:32:12 compute-0 python3.9[59995]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:32:12 compute-0 sudo[60149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkqfpxgxjdkqebfmqbtbtsxbamibtdkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224732.5396006-35-83385428899408/AnsiballZ_setup.py'
Jan 12 13:32:12 compute-0 sudo[60149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:12 compute-0 python3.9[60151]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:32:13 compute-0 sudo[60149]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:13 compute-0 sudo[60233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axcwfwadrrsmfffguwdgodrqygoqmkrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224732.5396006-35-83385428899408/AnsiballZ_dnf.py'
Jan 12 13:32:13 compute-0 sudo[60233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:13 compute-0 python3.9[60235]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:32:14 compute-0 sudo[60233]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:14 compute-0 sudo[60387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktpnnlkmiobnunfapdadwigsllnhxofn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224734.58681-47-159985582167917/AnsiballZ_setup.py'
Jan 12 13:32:14 compute-0 sudo[60387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:14 compute-0 python3.9[60389]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:32:15 compute-0 sudo[60387]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:15 compute-0 sudo[60578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mckyulcogbbslmzrpaahisggpcvpwodn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224735.2819085-58-201186523162622/AnsiballZ_file.py'
Jan 12 13:32:15 compute-0 sudo[60578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:15 compute-0 python3.9[60580]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:15 compute-0 sudo[60578]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:16 compute-0 sudo[60730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruqtpgntejkpwynpafsfinbjtkphaunc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224735.8123271-66-45257694270034/AnsiballZ_command.py'
Jan 12 13:32:16 compute-0 sudo[60730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:16 compute-0 python3.9[60732]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:32:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:32:16 compute-0 sudo[60730]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:16 compute-0 sudo[60892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncyufthgqnrjykgjxhcslstievxppiwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224736.462649-74-131164684599480/AnsiballZ_stat.py'
Jan 12 13:32:16 compute-0 sudo[60892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:16 compute-0 python3.9[60894]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:16 compute-0 sudo[60892]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:17 compute-0 sudo[60970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlevjsmdoyfcohfmdcgesgulwmstwwjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224736.462649-74-131164684599480/AnsiballZ_file.py'
Jan 12 13:32:17 compute-0 sudo[60970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:17 compute-0 python3.9[60972]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:17 compute-0 sudo[60970]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:17 compute-0 sudo[61122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnslqsuxeafewxoxxbqsxrukwdjinzcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224737.3108644-86-182347950193891/AnsiballZ_stat.py'
Jan 12 13:32:17 compute-0 sudo[61122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:17 compute-0 python3.9[61124]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:17 compute-0 sudo[61122]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:17 compute-0 sudo[61200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymuernjiyfbeeqyzjzmjfqhhcqeuojzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224737.3108644-86-182347950193891/AnsiballZ_file.py'
Jan 12 13:32:17 compute-0 sudo[61200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:17 compute-0 python3.9[61202]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:17 compute-0 sudo[61200]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:18 compute-0 sudo[61352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltzbkeluupfcojmsgxrhfoumkxecozpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224738.093663-99-278315565542942/AnsiballZ_ini_file.py'
Jan 12 13:32:18 compute-0 sudo[61352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:18 compute-0 python3.9[61354]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:18 compute-0 sudo[61352]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:18 compute-0 sudo[61504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btkhsiprzmrbnbkhhhpnrkqeelfhjsbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224738.6337376-99-268969119948216/AnsiballZ_ini_file.py'
Jan 12 13:32:18 compute-0 sudo[61504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:18 compute-0 python3.9[61506]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:18 compute-0 sudo[61504]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:19 compute-0 sudo[61656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mybgvyruhxwycrpfbnxhfdwydtuzngms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224739.036326-99-91858498063214/AnsiballZ_ini_file.py'
Jan 12 13:32:19 compute-0 sudo[61656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:19 compute-0 python3.9[61658]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:19 compute-0 sudo[61656]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:19 compute-0 sudo[61808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aavvxczhsupflkwpkkjscmeilxtpfwex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224739.450642-99-177281294882783/AnsiballZ_ini_file.py'
Jan 12 13:32:19 compute-0 sudo[61808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:19 compute-0 python3.9[61810]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:19 compute-0 sudo[61808]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:20 compute-0 sudo[61961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukkmpyqnkvdxexehwjpccnjvzfjdmilh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224739.9249678-130-226531909629368/AnsiballZ_dnf.py'
Jan 12 13:32:20 compute-0 sudo[61961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:20 compute-0 python3.9[61963]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:32:21 compute-0 sudo[61961]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:21 compute-0 sudo[62114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nudbmmsyqqockxqtahoiisdiuapawork ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224741.4647772-141-71731467440489/AnsiballZ_setup.py'
Jan 12 13:32:21 compute-0 sudo[62114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:21 compute-0 python3.9[62116]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:32:21 compute-0 sudo[62114]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:22 compute-0 sudo[62268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmbusmatmhifffegiqkgkhnwjgbvkawf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224742.0119107-149-129551854918158/AnsiballZ_stat.py'
Jan 12 13:32:22 compute-0 sudo[62268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:22 compute-0 python3.9[62270]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:32:22 compute-0 sudo[62268]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:22 compute-0 sudo[62420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlowonfsyynkofvetgrypedzfzzhhjrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224742.4717062-158-64910509732312/AnsiballZ_stat.py'
Jan 12 13:32:22 compute-0 sudo[62420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:22 compute-0 python3.9[62422]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:32:22 compute-0 sudo[62420]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:23 compute-0 sudo[62573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muykxupznzalysvgfjydfjteoghxdnpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224742.9736598-168-33972381545723/AnsiballZ_command.py'
Jan 12 13:32:23 compute-0 sudo[62573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:23 compute-0 python3.9[62575]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:32:23 compute-0 sudo[62573]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:23 compute-0 sudo[62726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-westvtqlrdhmffsllmlckynefuacdnwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224743.4656806-178-218076535488880/AnsiballZ_service_facts.py'
Jan 12 13:32:23 compute-0 sudo[62726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:23 compute-0 python3.9[62728]: ansible-service_facts Invoked
Jan 12 13:32:23 compute-0 network[62745]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 12 13:32:23 compute-0 network[62746]: 'network-scripts' will be removed from distribution in near future.
Jan 12 13:32:23 compute-0 network[62747]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 12 13:32:25 compute-0 sudo[62726]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:26 compute-0 sudo[63030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbgkerdixqhktoctxdlvnabkppxntnkh ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1768224746.0817702-193-44116586344165/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1768224746.0817702-193-44116586344165/args'
Jan 12 13:32:26 compute-0 sudo[63030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:26 compute-0 sudo[63030]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:26 compute-0 sudo[63197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rztanusjaezrojdjyxbjtuwuklqireun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224746.493788-204-188382591421894/AnsiballZ_dnf.py'
Jan 12 13:32:26 compute-0 sudo[63197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:26 compute-0 python3.9[63199]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:32:27 compute-0 sudo[63197]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:28 compute-0 sudo[63350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjxvlcieoxwvxbonqttcoomargcgadec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224747.9488745-217-90577180506497/AnsiballZ_package_facts.py'
Jan 12 13:32:28 compute-0 sudo[63350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:28 compute-0 python3.9[63352]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 12 13:32:28 compute-0 sudo[63350]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:29 compute-0 sudo[63502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmxwgaomklyqctsoxuxetpsnnsjknkfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224749.0416913-227-236936720155254/AnsiballZ_stat.py'
Jan 12 13:32:29 compute-0 sudo[63502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:29 compute-0 python3.9[63504]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:29 compute-0 sudo[63502]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:29 compute-0 sudo[63627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvjadzryhmzqloctejmobnqebwqltuzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224749.0416913-227-236936720155254/AnsiballZ_copy.py'
Jan 12 13:32:29 compute-0 sudo[63627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:30 compute-0 python3.9[63629]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224749.0416913-227-236936720155254/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:30 compute-0 sudo[63627]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:30 compute-0 sudo[63781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riqlctslgqbekhhrughcxmoxotbvfymq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224750.1754422-242-206030022998649/AnsiballZ_stat.py'
Jan 12 13:32:30 compute-0 sudo[63781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:30 compute-0 python3.9[63783]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:30 compute-0 sudo[63781]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:30 compute-0 sudo[63906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvzchanipjjibupdfuxhfwgawajcuoqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224750.1754422-242-206030022998649/AnsiballZ_copy.py'
Jan 12 13:32:30 compute-0 sudo[63906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:30 compute-0 python3.9[63908]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224750.1754422-242-206030022998649/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:30 compute-0 sudo[63906]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:31 compute-0 sudo[64060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtnklguenclaibvqzvggricaoeufprme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224751.2344608-263-177100590749192/AnsiballZ_lineinfile.py'
Jan 12 13:32:31 compute-0 sudo[64060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:31 compute-0 python3.9[64062]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:31 compute-0 sudo[64060]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:32 compute-0 sudo[64214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptfowxorlnjiyefexsqhgysggophpoqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224752.156405-278-247939763480541/AnsiballZ_setup.py'
Jan 12 13:32:32 compute-0 sudo[64214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:32 compute-0 python3.9[64216]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:32:32 compute-0 sudo[64214]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:33 compute-0 sudo[64298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iflhydkuywfqmibqmdffzepenaspkxgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224752.156405-278-247939763480541/AnsiballZ_systemd.py'
Jan 12 13:32:33 compute-0 sudo[64298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:33 compute-0 python3.9[64300]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:32:33 compute-0 sudo[64298]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:33 compute-0 sudo[64452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udllvjoxzptjvcbfpjflgigybrxhsame ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224753.7414525-294-59444996487450/AnsiballZ_setup.py'
Jan 12 13:32:33 compute-0 sudo[64452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:34 compute-0 python3.9[64454]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:32:34 compute-0 sudo[64452]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:34 compute-0 sudo[64536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnwloahfdirgxhjaustpyfllbzdjiebe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224753.7414525-294-59444996487450/AnsiballZ_systemd.py'
Jan 12 13:32:34 compute-0 sudo[64536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:34 compute-0 python3.9[64538]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:32:34 compute-0 chronyd[783]: chronyd exiting
Jan 12 13:32:34 compute-0 systemd[1]: Stopping NTP client/server...
Jan 12 13:32:34 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 12 13:32:34 compute-0 systemd[1]: Stopped NTP client/server.
Jan 12 13:32:34 compute-0 systemd[1]: Starting NTP client/server...
Jan 12 13:32:34 compute-0 chronyd[64547]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 12 13:32:34 compute-0 chronyd[64547]: Frequency -4.768 +/- 0.693 ppm read from /var/lib/chrony/drift
Jan 12 13:32:34 compute-0 chronyd[64547]: Loaded seccomp filter (level 2)
Jan 12 13:32:34 compute-0 systemd[1]: Started NTP client/server.
Jan 12 13:32:34 compute-0 sudo[64536]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:35 compute-0 sshd-session[59691]: Connection closed by 192.168.122.30 port 41268
Jan 12 13:32:35 compute-0 sshd-session[59688]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:32:35 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 12 13:32:35 compute-0 systemd[1]: session-13.scope: Consumed 16.913s CPU time.
Jan 12 13:32:35 compute-0 systemd-logind[775]: Session 13 logged out. Waiting for processes to exit.
Jan 12 13:32:35 compute-0 systemd-logind[775]: Removed session 13.
Jan 12 13:32:39 compute-0 sshd-session[64573]: Accepted publickey for zuul from 192.168.122.30 port 34204 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:32:39 compute-0 systemd-logind[775]: New session 14 of user zuul.
Jan 12 13:32:39 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 12 13:32:39 compute-0 sshd-session[64573]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:32:40 compute-0 python3.9[64726]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:32:41 compute-0 sudo[64880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yniyufxfldopqsnelktmuvjyeeumksto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224760.883712-28-131737897055311/AnsiballZ_file.py'
Jan 12 13:32:41 compute-0 sudo[64880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:41 compute-0 python3.9[64882]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:41 compute-0 sudo[64880]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:41 compute-0 sudo[65055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utjvbtlqxndnlmdilodnnsrnlesjtcec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224761.4313161-36-116070462696358/AnsiballZ_stat.py'
Jan 12 13:32:41 compute-0 sudo[65055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:41 compute-0 python3.9[65057]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:41 compute-0 sudo[65055]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:42 compute-0 sudo[65133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzknbiikzhejyywvmgypkpbdualgdicx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224761.4313161-36-116070462696358/AnsiballZ_file.py'
Jan 12 13:32:42 compute-0 sudo[65133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:42 compute-0 python3.9[65135]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.0hqxgnl3 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:42 compute-0 sudo[65133]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:42 compute-0 sudo[65285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckghqhgdgyscmjdkrowrikjqmllucuef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224762.573548-56-114685648934190/AnsiballZ_stat.py'
Jan 12 13:32:42 compute-0 sudo[65285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:42 compute-0 python3.9[65287]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:42 compute-0 sudo[65285]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:43 compute-0 sudo[65408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxeyxmgwqstctfyhuwkikzfcpuzygizb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224762.573548-56-114685648934190/AnsiballZ_copy.py'
Jan 12 13:32:43 compute-0 sudo[65408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:43 compute-0 python3.9[65410]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224762.573548-56-114685648934190/.source _original_basename=.p7_mmq1_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:43 compute-0 sudo[65408]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:43 compute-0 sudo[65560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvpbiodbzikuswiflblmsulmcmduwcti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224763.510375-72-66174260313471/AnsiballZ_file.py'
Jan 12 13:32:43 compute-0 sudo[65560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:43 compute-0 python3.9[65562]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:43 compute-0 sudo[65560]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:44 compute-0 sudo[65712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxmdvslpiuezwnbknwuqvevncpwifdub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224763.913089-80-68259255941982/AnsiballZ_stat.py'
Jan 12 13:32:44 compute-0 sudo[65712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:44 compute-0 python3.9[65714]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:44 compute-0 sudo[65712]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:44 compute-0 sudo[65835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfxwtpgzkasmhrpquuowzzsknrsokxqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224763.913089-80-68259255941982/AnsiballZ_copy.py'
Jan 12 13:32:44 compute-0 sudo[65835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:44 compute-0 python3.9[65837]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224763.913089-80-68259255941982/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:44 compute-0 sudo[65835]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:44 compute-0 sudo[65987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrnqrdhuowvhmjavozvqazjmxhdxlhxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224764.629192-80-273143231049497/AnsiballZ_stat.py'
Jan 12 13:32:44 compute-0 sudo[65987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:44 compute-0 python3.9[65989]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:44 compute-0 sudo[65987]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:45 compute-0 sudo[66110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nolwoxcijqiolhjuanvikvwzuixeffus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224764.629192-80-273143231049497/AnsiballZ_copy.py'
Jan 12 13:32:45 compute-0 sudo[66110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:45 compute-0 python3.9[66112]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224764.629192-80-273143231049497/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:32:45 compute-0 sudo[66110]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:45 compute-0 sudo[66262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrqscczodsutqrhocdtpbvjcuytvubtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224765.4059136-109-111029413938895/AnsiballZ_file.py'
Jan 12 13:32:45 compute-0 sudo[66262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:45 compute-0 python3.9[66264]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:45 compute-0 sudo[66262]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:45 compute-0 sudo[66414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfgvjvenxmmgsliujkdjmnwsjdmggpui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224765.8283994-117-53343509977661/AnsiballZ_stat.py'
Jan 12 13:32:45 compute-0 sudo[66414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:46 compute-0 python3.9[66416]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:46 compute-0 sudo[66414]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:46 compute-0 sudo[66537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vajaiqgnssswsknuinsywpofesklgbzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224765.8283994-117-53343509977661/AnsiballZ_copy.py'
Jan 12 13:32:46 compute-0 sudo[66537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:46 compute-0 python3.9[66539]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224765.8283994-117-53343509977661/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:46 compute-0 sudo[66537]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:46 compute-0 sudo[66689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvluepeazclqcailgpnhmubelfnuxmjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224766.6044946-132-234235422825724/AnsiballZ_stat.py'
Jan 12 13:32:46 compute-0 sudo[66689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:46 compute-0 python3.9[66691]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:46 compute-0 sudo[66689]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:47 compute-0 sudo[66812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niyghmegusqqmolepbptfyzodokolyvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224766.6044946-132-234235422825724/AnsiballZ_copy.py'
Jan 12 13:32:47 compute-0 sudo[66812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:47 compute-0 python3.9[66814]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224766.6044946-132-234235422825724/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:47 compute-0 sudo[66812]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:47 compute-0 sudo[66964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcyfvzedhhlqoejeofelmwughjpkgrzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224767.4950397-147-101542963639045/AnsiballZ_systemd.py'
Jan 12 13:32:47 compute-0 sudo[66964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:48 compute-0 python3.9[66966]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:32:48 compute-0 systemd[1]: Reloading.
Jan 12 13:32:48 compute-0 systemd-sysv-generator[66993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:32:48 compute-0 systemd-rc-local-generator[66989]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:32:48 compute-0 systemd[1]: Reloading.
Jan 12 13:32:48 compute-0 systemd-rc-local-generator[67024]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:32:48 compute-0 systemd-sysv-generator[67027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:32:48 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 12 13:32:48 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 12 13:32:48 compute-0 sudo[66964]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:48 compute-0 sudo[67191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leggvojkbfuatwsesdneldcpbsvszmsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224768.6662266-155-109985504888063/AnsiballZ_stat.py'
Jan 12 13:32:48 compute-0 sudo[67191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:48 compute-0 python3.9[67193]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:48 compute-0 sudo[67191]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:49 compute-0 sudo[67314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwmkefxakxdjmpmgdolyzulpmdpgshix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224768.6662266-155-109985504888063/AnsiballZ_copy.py'
Jan 12 13:32:49 compute-0 sudo[67314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:49 compute-0 python3.9[67316]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224768.6662266-155-109985504888063/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:49 compute-0 sudo[67314]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:49 compute-0 sudo[67466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gksglbprvfnkpfrvzriyjakltyrhqwpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224769.4964144-170-209146452477509/AnsiballZ_stat.py'
Jan 12 13:32:49 compute-0 sudo[67466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:49 compute-0 python3.9[67468]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:49 compute-0 sudo[67466]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:50 compute-0 sudo[67589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tspjtvfziitoyholrswdloijgsgsuyqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224769.4964144-170-209146452477509/AnsiballZ_copy.py'
Jan 12 13:32:50 compute-0 sudo[67589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:50 compute-0 python3.9[67591]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224769.4964144-170-209146452477509/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:50 compute-0 sudo[67589]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:50 compute-0 sudo[67741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pocbtwetwhkecplhuajslrnnjwsgjmtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224770.3230748-185-59173477549586/AnsiballZ_systemd.py'
Jan 12 13:32:50 compute-0 sudo[67741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:50 compute-0 python3.9[67743]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:32:50 compute-0 systemd[1]: Reloading.
Jan 12 13:32:50 compute-0 systemd-rc-local-generator[67763]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:32:50 compute-0 systemd-sysv-generator[67767]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:32:50 compute-0 systemd[1]: Reloading.
Jan 12 13:32:51 compute-0 systemd-sysv-generator[67805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:32:51 compute-0 systemd-rc-local-generator[67802]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:32:51 compute-0 systemd[1]: Starting Create netns directory...
Jan 12 13:32:51 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 12 13:32:51 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 12 13:32:51 compute-0 systemd[1]: Finished Create netns directory.
Jan 12 13:32:51 compute-0 sudo[67741]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:51 compute-0 python3.9[67968]: ansible-ansible.builtin.service_facts Invoked
Jan 12 13:32:51 compute-0 network[67985]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 12 13:32:51 compute-0 network[67986]: 'network-scripts' will be removed from distribution in near future.
Jan 12 13:32:51 compute-0 network[67987]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 12 13:32:53 compute-0 sudo[68247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynnybqbmvlmgtciwcmzgcvrplpevqfdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224773.6212733-201-253222548065278/AnsiballZ_systemd.py'
Jan 12 13:32:53 compute-0 sudo[68247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:54 compute-0 python3.9[68249]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:32:54 compute-0 systemd[1]: Reloading.
Jan 12 13:32:54 compute-0 systemd-rc-local-generator[68275]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:32:54 compute-0 systemd-sysv-generator[68278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:32:54 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 12 13:32:54 compute-0 iptables.init[68288]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 12 13:32:54 compute-0 iptables.init[68288]: iptables: Flushing firewall rules: [  OK  ]
Jan 12 13:32:54 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 12 13:32:54 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 12 13:32:54 compute-0 sudo[68247]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:54 compute-0 sudo[68482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcgotodwykoemjaydhokbyizdsmxhjkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224774.6253436-201-27900031653481/AnsiballZ_systemd.py'
Jan 12 13:32:54 compute-0 sudo[68482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:55 compute-0 python3.9[68484]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:32:55 compute-0 sudo[68482]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:55 compute-0 sudo[68636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwnkqxoosjhcobbnpjmzyuwrrlysncjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224775.2609298-217-62700540731452/AnsiballZ_systemd.py'
Jan 12 13:32:55 compute-0 sudo[68636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:55 compute-0 python3.9[68638]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:32:55 compute-0 systemd[1]: Reloading.
Jan 12 13:32:55 compute-0 systemd-rc-local-generator[68661]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:32:55 compute-0 systemd-sysv-generator[68664]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:32:55 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 12 13:32:55 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 12 13:32:55 compute-0 sudo[68636]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:56 compute-0 sudo[68827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xufvjhavbcafyehqelhzxqryqbrvwofu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224776.04665-225-140964329731906/AnsiballZ_command.py'
Jan 12 13:32:56 compute-0 sudo[68827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:56 compute-0 python3.9[68829]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:32:56 compute-0 sudo[68827]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:57 compute-0 sudo[68980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbeoiglplufesjfufelhaonnlbnycjpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224776.9824576-239-184983146507264/AnsiballZ_stat.py'
Jan 12 13:32:57 compute-0 sudo[68980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:57 compute-0 python3.9[68982]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:57 compute-0 sudo[68980]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:57 compute-0 sudo[69105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svjjpjjgtvmukuborfpllgclyrvppapv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224776.9824576-239-184983146507264/AnsiballZ_copy.py'
Jan 12 13:32:57 compute-0 sudo[69105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:57 compute-0 python3.9[69107]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224776.9824576-239-184983146507264/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:57 compute-0 sudo[69105]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:58 compute-0 sudo[69258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qovupdymooafjtfnynwozswfhyebvxjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224777.8532848-254-4003883012425/AnsiballZ_systemd.py'
Jan 12 13:32:58 compute-0 sudo[69258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:58 compute-0 python3.9[69260]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:32:58 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 12 13:32:58 compute-0 sshd[963]: Received SIGHUP; restarting.
Jan 12 13:32:58 compute-0 sshd[963]: Server listening on 0.0.0.0 port 22.
Jan 12 13:32:58 compute-0 sshd[963]: Server listening on :: port 22.
Jan 12 13:32:58 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 12 13:32:58 compute-0 sudo[69258]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:58 compute-0 sudo[69414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syvglqbngwzauwujtmpwyxtxespuetny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224778.4576957-262-126443823188392/AnsiballZ_file.py'
Jan 12 13:32:58 compute-0 sudo[69414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:58 compute-0 python3.9[69416]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:58 compute-0 sudo[69414]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:59 compute-0 sudo[69566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfchvptqjfdibtqjvddxshhhymryfkuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224778.915055-270-252263437805168/AnsiballZ_stat.py'
Jan 12 13:32:59 compute-0 sudo[69566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:59 compute-0 python3.9[69568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:32:59 compute-0 sudo[69566]: pam_unix(sudo:session): session closed for user root
Jan 12 13:32:59 compute-0 sudo[69689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euqfcpzaclmnooalvvqyxcajwjxovcxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224778.915055-270-252263437805168/AnsiballZ_copy.py'
Jan 12 13:32:59 compute-0 sudo[69689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:32:59 compute-0 python3.9[69691]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224778.915055-270-252263437805168/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:32:59 compute-0 sudo[69689]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:00 compute-0 sudo[69841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjlnnblekksgoktydmdodajoupjyxulh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224779.8238547-288-58097275486718/AnsiballZ_timezone.py'
Jan 12 13:33:00 compute-0 sudo[69841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:00 compute-0 python3.9[69843]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 12 13:33:00 compute-0 systemd[1]: Starting Time & Date Service...
Jan 12 13:33:00 compute-0 systemd[1]: Started Time & Date Service.
Jan 12 13:33:00 compute-0 sudo[69841]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:00 compute-0 sudo[69997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amxifpbyciigfgnaaqrmenmqmdsgjvdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224780.5270467-297-208304932492017/AnsiballZ_file.py'
Jan 12 13:33:00 compute-0 sudo[69997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:00 compute-0 python3.9[69999]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:00 compute-0 sudo[69997]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:01 compute-0 sudo[70149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mauwodtvvpfilbtoqktaxrrduxwxkgzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224781.1330569-305-174274530476682/AnsiballZ_stat.py'
Jan 12 13:33:01 compute-0 sudo[70149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:01 compute-0 python3.9[70151]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:01 compute-0 sudo[70149]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:01 compute-0 sudo[70272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abeikbhngtxgseihjbswnuobiyltjfpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224781.1330569-305-174274530476682/AnsiballZ_copy.py'
Jan 12 13:33:01 compute-0 sudo[70272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:01 compute-0 python3.9[70274]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224781.1330569-305-174274530476682/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:01 compute-0 sudo[70272]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:02 compute-0 sudo[70424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guhpslykaoxdeccxwuqcisfqvnsitvlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224782.083693-320-243338853091168/AnsiballZ_stat.py'
Jan 12 13:33:02 compute-0 sudo[70424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:02 compute-0 python3.9[70426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:02 compute-0 sudo[70424]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:02 compute-0 sudo[70547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chaqgnakautkqxqxujuiraugfmjzclvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224782.083693-320-243338853091168/AnsiballZ_copy.py'
Jan 12 13:33:02 compute-0 sudo[70547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:02 compute-0 python3.9[70549]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224782.083693-320-243338853091168/.source.yaml _original_basename=.5ga9ss5j follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:02 compute-0 sudo[70547]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:03 compute-0 sudo[70699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgsjryyqgfphfqjsokdgwswkqaspwldr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224783.026215-335-277190321671375/AnsiballZ_stat.py'
Jan 12 13:33:03 compute-0 sudo[70699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:03 compute-0 python3.9[70701]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:03 compute-0 sudo[70699]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:03 compute-0 sudo[70822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdmzfauzmmifczaqndzzupwcsiqtoimx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224783.026215-335-277190321671375/AnsiballZ_copy.py'
Jan 12 13:33:03 compute-0 sudo[70822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:03 compute-0 python3.9[70824]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224783.026215-335-277190321671375/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:03 compute-0 sudo[70822]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:04 compute-0 sudo[70974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytimorxchzhznvydfdigpqbgjqjbamoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224783.91941-350-97642693537320/AnsiballZ_command.py'
Jan 12 13:33:04 compute-0 sudo[70974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:04 compute-0 python3.9[70976]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:33:04 compute-0 sudo[70974]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:04 compute-0 sudo[71127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vinsaqrhycgrihscgwgnwrxergmkiiul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224784.4508035-358-151633419282470/AnsiballZ_command.py'
Jan 12 13:33:04 compute-0 sudo[71127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:04 compute-0 python3.9[71129]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:33:04 compute-0 sudo[71127]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:05 compute-0 sudo[71280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdasttseiaqebytnyxfwnzfymtavhkwv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768224784.9322789-366-26235750381163/AnsiballZ_edpm_nftables_from_files.py'
Jan 12 13:33:05 compute-0 sudo[71280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:05 compute-0 python3[71282]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 12 13:33:05 compute-0 sudo[71280]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:05 compute-0 sudo[71432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyypcwtafypuhwgrufznmrzcjsunqgqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224785.668874-374-218894788037877/AnsiballZ_stat.py'
Jan 12 13:33:05 compute-0 sudo[71432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:06 compute-0 python3.9[71434]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:06 compute-0 sudo[71432]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:06 compute-0 sudo[71555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlvbpgeqgaijnaoiyfixvznlzbeuderi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224785.668874-374-218894788037877/AnsiballZ_copy.py'
Jan 12 13:33:06 compute-0 sudo[71555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:06 compute-0 python3.9[71557]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224785.668874-374-218894788037877/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:06 compute-0 sudo[71555]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:06 compute-0 sudo[71707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veckszymtppqjikyeziwzmmheszkdieq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224786.517784-389-182950165287014/AnsiballZ_stat.py'
Jan 12 13:33:06 compute-0 sudo[71707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:06 compute-0 python3.9[71709]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:06 compute-0 sudo[71707]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:07 compute-0 sudo[71830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsrthzxcaoggytcnxgtvxxlcozgmawxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224786.517784-389-182950165287014/AnsiballZ_copy.py'
Jan 12 13:33:07 compute-0 sudo[71830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:07 compute-0 python3.9[71832]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224786.517784-389-182950165287014/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:07 compute-0 sudo[71830]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:07 compute-0 sudo[71982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kubmhiphotkhmfvfmufeefokgztgfsza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224787.395756-404-266417750188002/AnsiballZ_stat.py'
Jan 12 13:33:07 compute-0 sudo[71982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:07 compute-0 python3.9[71984]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:07 compute-0 sudo[71982]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:07 compute-0 sudo[72105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqxxqrpqmhzmcdplqahcwvepwtwomvij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224787.395756-404-266417750188002/AnsiballZ_copy.py'
Jan 12 13:33:07 compute-0 sudo[72105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:08 compute-0 python3.9[72107]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224787.395756-404-266417750188002/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:08 compute-0 sudo[72105]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:08 compute-0 sudo[72257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eacheeyivoejwsxavnewdggpqthhbrzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224788.2409742-419-107735528407658/AnsiballZ_stat.py'
Jan 12 13:33:08 compute-0 sudo[72257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:08 compute-0 python3.9[72259]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:08 compute-0 sudo[72257]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:08 compute-0 sudo[72380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgkwgazsopcsfqahfnknqwsznhpdsinv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224788.2409742-419-107735528407658/AnsiballZ_copy.py'
Jan 12 13:33:08 compute-0 sudo[72380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:08 compute-0 python3.9[72382]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224788.2409742-419-107735528407658/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:08 compute-0 sudo[72380]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:09 compute-0 sudo[72532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuofaskejfdxcyitwjxfhdkmrdfhfgme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224789.1328833-434-203944437116955/AnsiballZ_stat.py'
Jan 12 13:33:09 compute-0 sudo[72532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:09 compute-0 python3.9[72534]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:09 compute-0 sudo[72532]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:09 compute-0 sudo[72655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biuzlcibhgesbvewembmvcelgyjgtqxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224789.1328833-434-203944437116955/AnsiballZ_copy.py'
Jan 12 13:33:09 compute-0 sudo[72655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:10 compute-0 python3.9[72657]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224789.1328833-434-203944437116955/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:10 compute-0 sudo[72655]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:10 compute-0 sudo[72807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyhlsyptblgepitpvnalylrvcqzqepkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224790.184221-449-11160154519481/AnsiballZ_file.py'
Jan 12 13:33:10 compute-0 sudo[72807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:10 compute-0 python3.9[72809]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:10 compute-0 sudo[72807]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:10 compute-0 sudo[72959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrttespiaclyxiohwuhiolnacyhcqpic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224790.6561882-457-217491276361404/AnsiballZ_command.py'
Jan 12 13:33:10 compute-0 sudo[72959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:10 compute-0 python3.9[72961]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:33:11 compute-0 sudo[72959]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:11 compute-0 sudo[73118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvdhsllxplramwzpmbzftwsrhfxlplqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224791.1635432-465-274948836260255/AnsiballZ_blockinfile.py'
Jan 12 13:33:11 compute-0 sudo[73118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:11 compute-0 python3.9[73120]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:11 compute-0 sudo[73118]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:12 compute-0 sudo[73271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oymgptztonverhbftvvxxgnkzcdfvedv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224791.8602448-474-146528806613614/AnsiballZ_file.py'
Jan 12 13:33:12 compute-0 sudo[73271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:12 compute-0 python3.9[73273]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:12 compute-0 sudo[73271]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:12 compute-0 sudo[73423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vosccketuhbnfjhrmzmuodaxljqilnlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224792.305932-474-23087613326504/AnsiballZ_file.py'
Jan 12 13:33:12 compute-0 sudo[73423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:12 compute-0 python3.9[73425]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:12 compute-0 sudo[73423]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:13 compute-0 sudo[73575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adtzyqujwrsteiwjxnrgzgppqwguozyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224792.7771723-489-163455758835907/AnsiballZ_mount.py'
Jan 12 13:33:13 compute-0 sudo[73575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:13 compute-0 python3.9[73577]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 12 13:33:13 compute-0 sudo[73575]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:13 compute-0 sudo[73728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obznzrenabdjfchlwcepyhsfvxqfvhhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224793.407379-489-139292347285740/AnsiballZ_mount.py'
Jan 12 13:33:13 compute-0 sudo[73728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:13 compute-0 python3.9[73730]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 12 13:33:13 compute-0 sudo[73728]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:14 compute-0 sshd-session[64576]: Connection closed by 192.168.122.30 port 34204
Jan 12 13:33:14 compute-0 sshd-session[64573]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:33:14 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 12 13:33:14 compute-0 systemd[1]: session-14.scope: Consumed 23.890s CPU time.
Jan 12 13:33:14 compute-0 systemd-logind[775]: Session 14 logged out. Waiting for processes to exit.
Jan 12 13:33:14 compute-0 systemd-logind[775]: Removed session 14.
Jan 12 13:33:19 compute-0 sshd-session[73756]: Accepted publickey for zuul from 192.168.122.30 port 44450 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:33:19 compute-0 systemd-logind[775]: New session 15 of user zuul.
Jan 12 13:33:19 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 12 13:33:19 compute-0 sshd-session[73756]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:33:19 compute-0 sudo[73909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujmjfsmwzzlzqgckzyydinwickftbkqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224799.1425226-16-150325799413243/AnsiballZ_tempfile.py'
Jan 12 13:33:19 compute-0 sudo[73909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:19 compute-0 python3.9[73911]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 12 13:33:19 compute-0 sudo[73909]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:19 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 12 13:33:20 compute-0 sudo[74062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgvaehkxacmcqyofkrbxfssdcnonneiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224799.7498906-28-116971129151332/AnsiballZ_stat.py'
Jan 12 13:33:20 compute-0 sudo[74062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:20 compute-0 python3.9[74064]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:33:20 compute-0 sudo[74062]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:20 compute-0 sudo[74214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuusfnfaftcwiogeegkwcsmofbmiviyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224800.3584104-38-167167429772934/AnsiballZ_setup.py'
Jan 12 13:33:20 compute-0 sudo[74214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:20 compute-0 python3.9[74216]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:33:21 compute-0 sudo[74214]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:21 compute-0 sudo[74366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-samzrrdourbdbehjvsbmibicmaxwrlur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224801.165144-47-226414550174218/AnsiballZ_blockinfile.py'
Jan 12 13:33:21 compute-0 sudo[74366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:21 compute-0 python3.9[74368]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCnIEHGMi2RZ2OEHO2Q5FhVb/Z0+KF7fuYQZTdkY1Q4AtCfE9EWDlDTGb5jrF1/M9JClvpadI9rrrEvmtQLj9rsbQnzMxNihrutKa2vhjX6zkD08lz1FZZ1deTFIGESSRScKyMar13KOXm02dGeD9vV5FQXB+Rdpew55ymE/19PrRdIS1h+Sor0a1EfMoTSBF6q133ajz0JPLy5h+q+92jA3nj587bGfYmkCdxG2sjVx6Q+NEfSK41J8bZoGbpHZYnxRy81ImP8ZuI8bRLI3RdOftTgqRacLCv0IDzicOQ6E+uQ3qRE+g5mcI59kuoymWi425b95uM1hZbKYOskZafvJy7Fxew3zgianiJBtCw7LWfMQR2IMbL3H/YwSdOa5eYex/w6Mmi5mvVxhqhDDjzg7YV/QOB4hU4dOAnwLz52mTIFJaosCJi4Y9DgXTeDwvJbe1iowWRTKIjEYiToHcUO7ncBS+kv0s35uaMGzN1w+BfzAUbuA4SbwU7YOQN8Ink=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIETjz2PdMuAs35EpiJWxfQmUXhD7zrjndwfDsxpLm5/s
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEEnVrwEhMzl9Lt3EBTy76FoNI4fc5GrBCKsxAC3EJGSUJZk6Khb8kfib4QPE7nisiw94TISWOhTtbLKrZ+gHWc=
                                             create=True mode=0644 path=/tmp/ansible.490iriqj state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:21 compute-0 sudo[74366]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:21 compute-0 sudo[74518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caprdyzetpzwpicnaltoeczkmwfnwtmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224801.7234993-55-207827642632899/AnsiballZ_command.py'
Jan 12 13:33:21 compute-0 sudo[74518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:22 compute-0 python3.9[74520]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.490iriqj' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:33:22 compute-0 sudo[74518]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:22 compute-0 sudo[74672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkcxoczfmqcwuxglestisouoedamrztl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224802.2878804-63-47744892001707/AnsiballZ_file.py'
Jan 12 13:33:22 compute-0 sudo[74672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:22 compute-0 python3.9[74674]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.490iriqj state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:22 compute-0 sudo[74672]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:23 compute-0 sshd-session[73759]: Connection closed by 192.168.122.30 port 44450
Jan 12 13:33:23 compute-0 sshd-session[73756]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:33:23 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 12 13:33:23 compute-0 systemd[1]: session-15.scope: Consumed 2.329s CPU time.
Jan 12 13:33:23 compute-0 systemd-logind[775]: Session 15 logged out. Waiting for processes to exit.
Jan 12 13:33:23 compute-0 systemd-logind[775]: Removed session 15.
Jan 12 13:33:28 compute-0 sshd-session[74699]: Accepted publickey for zuul from 192.168.122.30 port 47928 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:33:28 compute-0 systemd-logind[775]: New session 16 of user zuul.
Jan 12 13:33:28 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 12 13:33:28 compute-0 sshd-session[74699]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:33:29 compute-0 python3.9[74852]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:33:29 compute-0 sudo[75006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyzwlydbkpihvobsnroqaixyxzubdptf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224809.327963-27-271068601387899/AnsiballZ_systemd.py'
Jan 12 13:33:29 compute-0 sudo[75006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:30 compute-0 python3.9[75008]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 12 13:33:30 compute-0 sudo[75006]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:30 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 12 13:33:30 compute-0 sudo[75160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fniomjpovwvccqvyfcirrttqbzudmyqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224810.1901908-35-51030513213393/AnsiballZ_systemd.py'
Jan 12 13:33:30 compute-0 sudo[75160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:30 compute-0 python3.9[75164]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:33:30 compute-0 sudo[75160]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:31 compute-0 sudo[75315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnwlcadhovvlorlcqottjmjfxljjgmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224810.798714-44-194501828701006/AnsiballZ_command.py'
Jan 12 13:33:31 compute-0 sudo[75315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:31 compute-0 python3.9[75317]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:33:31 compute-0 sudo[75315]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:31 compute-0 sudo[75468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbssuyqvkxfnukaiakktzuucgymvsotm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224811.4075692-52-207007539215525/AnsiballZ_stat.py'
Jan 12 13:33:31 compute-0 sudo[75468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:31 compute-0 python3.9[75470]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:33:31 compute-0 sudo[75468]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:32 compute-0 sudo[75622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctbhyigjpxpbettsmsmuwehgufmjgxxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224811.9897218-60-34420507972678/AnsiballZ_command.py'
Jan 12 13:33:32 compute-0 sudo[75622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:32 compute-0 python3.9[75624]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:33:32 compute-0 sudo[75622]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:32 compute-0 sudo[75777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnbdlvdqxfrciototpfxejetrecfnvdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224812.4586418-68-93594631702472/AnsiballZ_file.py'
Jan 12 13:33:32 compute-0 sudo[75777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:32 compute-0 python3.9[75779]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:32 compute-0 sudo[75777]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:33 compute-0 sshd-session[74702]: Connection closed by 192.168.122.30 port 47928
Jan 12 13:33:33 compute-0 sshd-session[74699]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:33:33 compute-0 systemd-logind[775]: Session 16 logged out. Waiting for processes to exit.
Jan 12 13:33:33 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 12 13:33:33 compute-0 systemd[1]: session-16.scope: Consumed 3.074s CPU time.
Jan 12 13:33:33 compute-0 systemd-logind[775]: Removed session 16.
Jan 12 13:33:38 compute-0 sshd-session[75804]: Accepted publickey for zuul from 192.168.122.30 port 60292 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:33:38 compute-0 systemd-logind[775]: New session 17 of user zuul.
Jan 12 13:33:38 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 12 13:33:38 compute-0 sshd-session[75804]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:33:39 compute-0 python3.9[75957]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:33:39 compute-0 sudo[76111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqaxjodydkdojdjqnvzwulriiqprnscq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224819.5139344-29-77504675851096/AnsiballZ_setup.py'
Jan 12 13:33:39 compute-0 sudo[76111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:39 compute-0 python3.9[76113]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:33:40 compute-0 sudo[76111]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:40 compute-0 sudo[76195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxrsyapwsxffprivufsncmmoaguerxgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224819.5139344-29-77504675851096/AnsiballZ_dnf.py'
Jan 12 13:33:40 compute-0 sudo[76195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:40 compute-0 python3.9[76197]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 12 13:33:41 compute-0 sudo[76195]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:42 compute-0 python3.9[76348]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:33:43 compute-0 python3.9[76499]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 12 13:33:43 compute-0 python3.9[76649]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:33:44 compute-0 python3.9[76799]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:33:44 compute-0 sshd-session[75807]: Connection closed by 192.168.122.30 port 60292
Jan 12 13:33:44 compute-0 sshd-session[75804]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:33:44 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 12 13:33:44 compute-0 systemd[1]: session-17.scope: Consumed 4.245s CPU time.
Jan 12 13:33:44 compute-0 systemd-logind[775]: Session 17 logged out. Waiting for processes to exit.
Jan 12 13:33:44 compute-0 systemd-logind[775]: Removed session 17.
Jan 12 13:33:49 compute-0 sshd-session[76824]: Accepted publickey for zuul from 192.168.122.30 port 48714 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:33:49 compute-0 systemd-logind[775]: New session 18 of user zuul.
Jan 12 13:33:49 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 12 13:33:49 compute-0 sshd-session[76824]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:33:50 compute-0 python3.9[76977]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:33:51 compute-0 sudo[77131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpitfeottqeyrgabnxpzutknvabebhlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224831.1035597-45-209410538302020/AnsiballZ_file.py'
Jan 12 13:33:51 compute-0 sudo[77131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:51 compute-0 python3.9[77133]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:33:51 compute-0 sudo[77131]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:51 compute-0 sudo[77283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyotcevjzpucivjkutjamqhqinqolkbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224831.6870956-45-123627716937359/AnsiballZ_file.py'
Jan 12 13:33:51 compute-0 sudo[77283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:52 compute-0 python3.9[77285]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:33:52 compute-0 sudo[77283]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:52 compute-0 sudo[77435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiduzummkwqcvffizdekwtquzxwwpdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224832.1407716-60-195600083560049/AnsiballZ_stat.py'
Jan 12 13:33:52 compute-0 sudo[77435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:52 compute-0 python3.9[77437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:52 compute-0 sudo[77435]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:52 compute-0 sudo[77558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfqcivibhuykepnhjwzxkilmlzmxkczv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224832.1407716-60-195600083560049/AnsiballZ_copy.py'
Jan 12 13:33:52 compute-0 sudo[77558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:53 compute-0 python3.9[77560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224832.1407716-60-195600083560049/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=27ba9df8aeac638ef27afda00f73ac67bf60d9ea backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:53 compute-0 sudo[77558]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:53 compute-0 sudo[77710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxxovxxsdqtemrylqqkxwpxppcbauvev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224833.1964395-60-171227923542447/AnsiballZ_stat.py'
Jan 12 13:33:53 compute-0 sudo[77710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:53 compute-0 python3.9[77712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:53 compute-0 sudo[77710]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:53 compute-0 sudo[77833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agljawnsfvapmdgubcyomzjduwllkjit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224833.1964395-60-171227923542447/AnsiballZ_copy.py'
Jan 12 13:33:53 compute-0 sudo[77833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:53 compute-0 python3.9[77835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224833.1964395-60-171227923542447/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=34292d4858ee238800f37d6695bbf210ca4d3485 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:53 compute-0 sudo[77833]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:54 compute-0 sudo[77985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnpiopgfqttrbxgbxulgmmbwzjvwklgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224833.990461-60-154608080569469/AnsiballZ_stat.py'
Jan 12 13:33:54 compute-0 sudo[77985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:54 compute-0 python3.9[77987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:54 compute-0 sudo[77985]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:54 compute-0 sudo[78108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzhslwwilikczslqoudoitowixlpgfry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224833.990461-60-154608080569469/AnsiballZ_copy.py'
Jan 12 13:33:54 compute-0 sudo[78108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:54 compute-0 python3.9[78110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224833.990461-60-154608080569469/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=48b47e9fa58c06e386f5ada3caeea02805793a4f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:54 compute-0 sudo[78108]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:55 compute-0 sudo[78260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpwnxbgmcqzvammluvvoawlqadsakxga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224834.8666759-104-12458440124928/AnsiballZ_file.py'
Jan 12 13:33:55 compute-0 sudo[78260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:55 compute-0 python3.9[78262]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:33:55 compute-0 sudo[78260]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:55 compute-0 sudo[78412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzagkqhjkwmyzjgounhflvnatbhgmohf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224835.2876325-104-175741461425344/AnsiballZ_file.py'
Jan 12 13:33:55 compute-0 sudo[78412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:55 compute-0 python3.9[78414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:33:55 compute-0 sudo[78412]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:55 compute-0 sudo[78564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehhouucqnyxoiieummrredcvcxrpyyqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224835.7654388-119-68629203618711/AnsiballZ_stat.py'
Jan 12 13:33:55 compute-0 sudo[78564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:56 compute-0 python3.9[78566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:56 compute-0 sudo[78564]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:56 compute-0 sudo[78687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghrvhhsghlbaevhwyfzyacqydnfsmxkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224835.7654388-119-68629203618711/AnsiballZ_copy.py'
Jan 12 13:33:56 compute-0 sudo[78687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:56 compute-0 python3.9[78689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224835.7654388-119-68629203618711/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=12270203809b0d23193e79fab726e8ef285b0d35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:56 compute-0 sudo[78687]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:56 compute-0 sudo[78839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpakuevsqyffsmswifkhkksjdwqbvfuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224836.5674036-119-238852256641933/AnsiballZ_stat.py'
Jan 12 13:33:56 compute-0 sudo[78839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:56 compute-0 python3.9[78841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:56 compute-0 sudo[78839]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:57 compute-0 sudo[78962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjwddstazmesvfliuhehfwdichmenolk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224836.5674036-119-238852256641933/AnsiballZ_copy.py'
Jan 12 13:33:57 compute-0 sudo[78962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:57 compute-0 python3.9[78964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224836.5674036-119-238852256641933/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f82dbe75d5a20158668f538176ec94e745af4a65 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:57 compute-0 sudo[78962]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:57 compute-0 sudo[79114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivwhinmfcrccktvxthsruoiatmircmcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224837.344893-119-103089303937169/AnsiballZ_stat.py'
Jan 12 13:33:57 compute-0 sudo[79114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:57 compute-0 python3.9[79116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:57 compute-0 sudo[79114]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:57 compute-0 sudo[79237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydqjmxsjawuwinjnqglooewhwmhutvcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224837.344893-119-103089303937169/AnsiballZ_copy.py'
Jan 12 13:33:57 compute-0 sudo[79237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:58 compute-0 python3.9[79239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224837.344893-119-103089303937169/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=e8682fff50d34e24eeb44b083fca60a789c5cb0c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:58 compute-0 sudo[79237]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:58 compute-0 sudo[79389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtrjopgnfsuvztgnogbdigtkepxwdobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224838.1920128-163-87143504455676/AnsiballZ_file.py'
Jan 12 13:33:58 compute-0 sudo[79389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:58 compute-0 python3.9[79391]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:33:58 compute-0 sudo[79389]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:58 compute-0 sudo[79541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewzirgfwyfyhmmbhooczncvfsjfxinhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224838.645486-163-199489479816568/AnsiballZ_file.py'
Jan 12 13:33:58 compute-0 sudo[79541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:58 compute-0 python3.9[79543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:33:58 compute-0 sudo[79541]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:59 compute-0 sudo[79693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqpiufifmaihbwwyeruxmphhfqnvdqvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224839.1073172-178-76187375368238/AnsiballZ_stat.py'
Jan 12 13:33:59 compute-0 sudo[79693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:59 compute-0 python3.9[79695]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:33:59 compute-0 sudo[79693]: pam_unix(sudo:session): session closed for user root
Jan 12 13:33:59 compute-0 sudo[79816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaetcolvckluxbpvmoebcdvoaebngxwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224839.1073172-178-76187375368238/AnsiballZ_copy.py'
Jan 12 13:33:59 compute-0 sudo[79816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:33:59 compute-0 python3.9[79818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224839.1073172-178-76187375368238/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d238dc3101b1e8fb63d5a555b32f13c49f5d1974 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:33:59 compute-0 sudo[79816]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:00 compute-0 sudo[79968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqldavlbtegswmafphdvbtvagrfvhfqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224839.9030218-178-178914389533696/AnsiballZ_stat.py'
Jan 12 13:34:00 compute-0 sudo[79968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:00 compute-0 python3.9[79970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:00 compute-0 sudo[79968]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:00 compute-0 sudo[80091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aybzelgimuhbqxrwxpkcjvcywllbbnak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224839.9030218-178-178914389533696/AnsiballZ_copy.py'
Jan 12 13:34:00 compute-0 sudo[80091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:00 compute-0 python3.9[80093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224839.9030218-178-178914389533696/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=5b9c8da7da65ca00b9d6737b14bf6c8d443c22e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:00 compute-0 sudo[80091]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:00 compute-0 sudo[80243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waelmtucanrfwnfmjadezfupvzacndos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224840.686173-178-116885190999289/AnsiballZ_stat.py'
Jan 12 13:34:00 compute-0 sudo[80243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:00 compute-0 python3.9[80245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:01 compute-0 sudo[80243]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:01 compute-0 sudo[80366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buhrwvijvgvolvfyfcpaipnpzbwmtguz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224840.686173-178-116885190999289/AnsiballZ_copy.py'
Jan 12 13:34:01 compute-0 sudo[80366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:01 compute-0 python3.9[80368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224840.686173-178-116885190999289/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=dd38fc501cd1689e97516749d12b8a3be8a89191 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:01 compute-0 sudo[80366]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:01 compute-0 sudo[80518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsfqoeozwvuhcdsjoabgzvftdumkstdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224841.5300348-222-107529099175414/AnsiballZ_file.py'
Jan 12 13:34:01 compute-0 sudo[80518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:01 compute-0 python3.9[80520]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:01 compute-0 sudo[80518]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:02 compute-0 sudo[80670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykwynqiypeosynksmhvjoqhbicelgtxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224841.9506793-222-211384100261325/AnsiballZ_file.py'
Jan 12 13:34:02 compute-0 sudo[80670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:02 compute-0 python3.9[80672]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:02 compute-0 sudo[80670]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:02 compute-0 sudo[80822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubjnzmnlidtluvpoizzyunohtesbrlsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224842.4177992-237-42327897416995/AnsiballZ_stat.py'
Jan 12 13:34:02 compute-0 sudo[80822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:02 compute-0 python3.9[80824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:02 compute-0 sudo[80822]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:02 compute-0 sudo[80945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlzelhivaaayddplzovlfquyktjkdsqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224842.4177992-237-42327897416995/AnsiballZ_copy.py'
Jan 12 13:34:02 compute-0 sudo[80945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:03 compute-0 python3.9[80947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224842.4177992-237-42327897416995/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=cabd186804f4a6ea7cc88a77af37e2db4a35d330 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:03 compute-0 sudo[80945]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:03 compute-0 sudo[81097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdmhckllwqavfizbznkzsqwbdtclwxgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224843.2506385-237-214383714243506/AnsiballZ_stat.py'
Jan 12 13:34:03 compute-0 sudo[81097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:03 compute-0 python3.9[81099]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:03 compute-0 sudo[81097]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:03 compute-0 sudo[81220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkdtxcnrfhtvozuvoecczsexwwqwwgla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224843.2506385-237-214383714243506/AnsiballZ_copy.py'
Jan 12 13:34:03 compute-0 sudo[81220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:03 compute-0 python3.9[81222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224843.2506385-237-214383714243506/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=5b9c8da7da65ca00b9d6737b14bf6c8d443c22e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:03 compute-0 sudo[81220]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:04 compute-0 sudo[81372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgwqhdvcagjnktjrwucgauotpqomxecx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224844.0775354-237-138583586365059/AnsiballZ_stat.py'
Jan 12 13:34:04 compute-0 sudo[81372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:04 compute-0 python3.9[81374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:04 compute-0 sudo[81372]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:04 compute-0 sudo[81495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydcnkvbqodkwlsuqafmjayhovmaejhnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224844.0775354-237-138583586365059/AnsiballZ_copy.py'
Jan 12 13:34:04 compute-0 sudo[81495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:04 compute-0 python3.9[81497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224844.0775354-237-138583586365059/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=2862a86fe8b1967da3733227e6e629d1e0462ce0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:04 compute-0 sudo[81495]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:05 compute-0 sudo[81647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ircnvxlggovcgixihxzhpgefspoopjfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224845.313254-297-72383212200103/AnsiballZ_file.py'
Jan 12 13:34:05 compute-0 sudo[81647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:05 compute-0 python3.9[81649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:05 compute-0 sudo[81647]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:05 compute-0 sudo[81799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lktlwheczchbfkngklsmizapogdkdjmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224845.7598977-305-258471740744084/AnsiballZ_stat.py'
Jan 12 13:34:05 compute-0 sudo[81799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:06 compute-0 python3.9[81801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:06 compute-0 sudo[81799]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:06 compute-0 sudo[81922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbmnwdzfmnvxgxkbbitijshnierlrdct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224845.7598977-305-258471740744084/AnsiballZ_copy.py'
Jan 12 13:34:06 compute-0 sudo[81922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:06 compute-0 python3.9[81924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224845.7598977-305-258471740744084/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:06 compute-0 sudo[81922]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:06 compute-0 sudo[82074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqpyhcxjfwadtunppyjnvigfdqsiiits ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224846.6408534-321-97403977764209/AnsiballZ_file.py'
Jan 12 13:34:06 compute-0 sudo[82074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:06 compute-0 python3.9[82076]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:07 compute-0 sudo[82074]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:07 compute-0 sudo[82226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oicecwkkbkxnyrnkxfrfryvbgvlnpvau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224847.12154-329-89542539495255/AnsiballZ_stat.py'
Jan 12 13:34:07 compute-0 sudo[82226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:07 compute-0 python3.9[82228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:07 compute-0 sudo[82226]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:07 compute-0 sudo[82349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzzojmdgvzulolvxcdqbilkulusvkosi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224847.12154-329-89542539495255/AnsiballZ_copy.py'
Jan 12 13:34:07 compute-0 sudo[82349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:07 compute-0 python3.9[82351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224847.12154-329-89542539495255/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:07 compute-0 sudo[82349]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:08 compute-0 sudo[82501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofdzstxqsdoqbszszyjydbwjklmqizbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224848.0575593-345-198523685966451/AnsiballZ_file.py'
Jan 12 13:34:08 compute-0 sudo[82501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:08 compute-0 python3.9[82503]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:08 compute-0 sudo[82501]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:08 compute-0 sudo[82653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubkzixzsqjgoyqxvpkssewbtfieciyjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224848.6273968-353-280839257760189/AnsiballZ_stat.py'
Jan 12 13:34:08 compute-0 sudo[82653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:08 compute-0 python3.9[82655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:08 compute-0 sudo[82653]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:09 compute-0 sudo[82776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxzjjvdjtsgnjruavekxjjukmritgwad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224848.6273968-353-280839257760189/AnsiballZ_copy.py'
Jan 12 13:34:09 compute-0 sudo[82776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:09 compute-0 python3.9[82778]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224848.6273968-353-280839257760189/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:09 compute-0 sudo[82776]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:09 compute-0 sudo[82928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtprclrclrboqnfwsnhnlgrxydnwhbsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224849.5249505-369-28864591261734/AnsiballZ_file.py'
Jan 12 13:34:09 compute-0 sudo[82928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:09 compute-0 python3.9[82930]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:09 compute-0 sudo[82928]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:10 compute-0 sudo[83080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umkihcxzwscrgzteymatmopkvgvbzwew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224849.9825187-377-110881114777144/AnsiballZ_stat.py'
Jan 12 13:34:10 compute-0 sudo[83080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:10 compute-0 python3.9[83082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:10 compute-0 sudo[83080]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:10 compute-0 sudo[83203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjjgvylumdpzfqknswxrlgamoypuuikk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224849.9825187-377-110881114777144/AnsiballZ_copy.py'
Jan 12 13:34:10 compute-0 sudo[83203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:10 compute-0 python3.9[83205]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224849.9825187-377-110881114777144/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:10 compute-0 sudo[83203]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:11 compute-0 sudo[83355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iotzacqtvxcmwisgwwckvxoqvaurgucs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224850.8997035-393-161001378585031/AnsiballZ_file.py'
Jan 12 13:34:11 compute-0 sudo[83355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:11 compute-0 python3.9[83357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:11 compute-0 sudo[83355]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:11 compute-0 sudo[83507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kslwtapbpkqllrovnpjdgchlhytxjzso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224851.3554661-401-122422286796573/AnsiballZ_stat.py'
Jan 12 13:34:11 compute-0 sudo[83507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:11 compute-0 python3.9[83509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:11 compute-0 sudo[83507]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:11 compute-0 sudo[83630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgdjxzgdjcybrqejhyqdnsylomvungwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224851.3554661-401-122422286796573/AnsiballZ_copy.py'
Jan 12 13:34:11 compute-0 sudo[83630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:12 compute-0 python3.9[83632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224851.3554661-401-122422286796573/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:12 compute-0 sudo[83630]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:12 compute-0 sudo[83782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mltytirmbzcvwnjkuvfxpiqgmgpatjxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224852.233757-417-46753853015854/AnsiballZ_file.py'
Jan 12 13:34:12 compute-0 sudo[83782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:12 compute-0 python3.9[83784]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:12 compute-0 sudo[83782]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:13 compute-0 sudo[83934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjflqvvbihcwlxdzhgwsoxkvmttbhrft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224852.7951741-425-204975264575711/AnsiballZ_stat.py'
Jan 12 13:34:13 compute-0 sudo[83934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:13 compute-0 python3.9[83936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:13 compute-0 sudo[83934]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:13 compute-0 sudo[84057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwyblwewsivkbwjqwpewupzmantrdisa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224852.7951741-425-204975264575711/AnsiballZ_copy.py'
Jan 12 13:34:13 compute-0 sudo[84057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:13 compute-0 python3.9[84059]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224852.7951741-425-204975264575711/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:13 compute-0 sudo[84057]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:13 compute-0 sudo[84209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wutyknqhfggpcjtxmfdrgrvvlgtrbsaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224853.7215483-441-266629254696995/AnsiballZ_file.py'
Jan 12 13:34:13 compute-0 sudo[84209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:14 compute-0 python3.9[84211]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:14 compute-0 sudo[84209]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:14 compute-0 sudo[84361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ignkxzaizyujmcbbzhvsvfqcmstzmlid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224854.1839573-449-67971112353191/AnsiballZ_stat.py'
Jan 12 13:34:14 compute-0 sudo[84361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:14 compute-0 python3.9[84363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:14 compute-0 sudo[84361]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:14 compute-0 sudo[84484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfdmdzzejifdwjhqbklsgemeoyrhhvrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224854.1839573-449-67971112353191/AnsiballZ_copy.py'
Jan 12 13:34:14 compute-0 sudo[84484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:14 compute-0 python3.9[84486]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224854.1839573-449-67971112353191/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0cac625bbd47decf33f75dea60fabb3d2b50744c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:14 compute-0 sudo[84484]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:15 compute-0 sshd-session[76827]: Connection closed by 192.168.122.30 port 48714
Jan 12 13:34:15 compute-0 sshd-session[76824]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:34:15 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 12 13:34:15 compute-0 systemd[1]: session-18.scope: Consumed 19.407s CPU time.
Jan 12 13:34:15 compute-0 systemd-logind[775]: Session 18 logged out. Waiting for processes to exit.
Jan 12 13:34:15 compute-0 systemd-logind[775]: Removed session 18.
Jan 12 13:34:19 compute-0 sshd-session[84511]: Accepted publickey for zuul from 192.168.122.30 port 33842 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:34:19 compute-0 systemd-logind[775]: New session 19 of user zuul.
Jan 12 13:34:19 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 12 13:34:19 compute-0 sshd-session[84511]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:34:20 compute-0 python3.9[84664]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:34:21 compute-0 sudo[84818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czqveozrdarhpdjggmvosohmtkovynow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224861.0424302-29-2838147571432/AnsiballZ_file.py'
Jan 12 13:34:21 compute-0 sudo[84818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:21 compute-0 python3.9[84820]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:21 compute-0 sudo[84818]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:21 compute-0 sudo[84970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjsgtsgsfvbqaqhqlppxhdwhhglabrjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224861.6115465-29-181582839612157/AnsiballZ_file.py'
Jan 12 13:34:21 compute-0 sudo[84970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:21 compute-0 python3.9[84972]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:21 compute-0 sudo[84970]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:22 compute-0 python3.9[85122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:34:22 compute-0 sudo[85272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etklcuygouktnkohnaukfdkvhmqxrjiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224862.5858939-52-158386374206591/AnsiballZ_seboolean.py'
Jan 12 13:34:22 compute-0 sudo[85272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:23 compute-0 python3.9[85274]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 12 13:34:23 compute-0 sudo[85272]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:24 compute-0 sudo[85428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtpujsgdejpnyldhxjuyvnqcsxpkgptj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224863.9943202-62-265315078811048/AnsiballZ_setup.py'
Jan 12 13:34:24 compute-0 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 12 13:34:24 compute-0 sudo[85428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:24 compute-0 python3.9[85430]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:34:24 compute-0 sudo[85428]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:24 compute-0 sudo[85512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blgqntoirifzcaorzcjnkpmqxhrrzbov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224863.9943202-62-265315078811048/AnsiballZ_dnf.py'
Jan 12 13:34:24 compute-0 sudo[85512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:25 compute-0 python3.9[85514]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:34:26 compute-0 sudo[85512]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:26 compute-0 sudo[85665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdrrnhipjzmgkfadxlgnhqbilxyysjoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224866.1219158-74-95736326108260/AnsiballZ_systemd.py'
Jan 12 13:34:26 compute-0 sudo[85665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:26 compute-0 python3.9[85667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:34:26 compute-0 sudo[85665]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:27 compute-0 sudo[85820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkbtlbaawrhrzppynpqllplshxoymynd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768224866.9397488-82-95117677953361/AnsiballZ_edpm_nftables_snippet.py'
Jan 12 13:34:27 compute-0 sudo[85820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:27 compute-0 python3[85822]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 12 13:34:27 compute-0 sudo[85820]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:27 compute-0 sudo[85972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeykjgfvvcsvzswycnpfxefvwknedzkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224867.5782132-91-167549399227454/AnsiballZ_file.py'
Jan 12 13:34:27 compute-0 sudo[85972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:27 compute-0 python3.9[85974]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:27 compute-0 sudo[85972]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:28 compute-0 sudo[86124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agtbdfwtgegitqlbaikbdtezddyvqwbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224868.0433607-99-160989624973755/AnsiballZ_stat.py'
Jan 12 13:34:28 compute-0 sudo[86124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:28 compute-0 python3.9[86126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:28 compute-0 sudo[86124]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:28 compute-0 sudo[86202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgvjlhxzxwijuguohnabgfviiqfoxjtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224868.0433607-99-160989624973755/AnsiballZ_file.py'
Jan 12 13:34:28 compute-0 sudo[86202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:28 compute-0 python3.9[86204]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:28 compute-0 sudo[86202]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:29 compute-0 sudo[86354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxkcwtzqhgehcciemzjelmqdyvvwfdih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224868.946535-111-276998857015675/AnsiballZ_stat.py'
Jan 12 13:34:29 compute-0 sudo[86354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:29 compute-0 python3.9[86356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:29 compute-0 sudo[86354]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:29 compute-0 sudo[86432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxcgxcavqlfzlajncwcodzfzxhoesvmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224868.946535-111-276998857015675/AnsiballZ_file.py'
Jan 12 13:34:29 compute-0 sudo[86432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:29 compute-0 python3.9[86434]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xjj03ukz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:29 compute-0 sudo[86432]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:29 compute-0 sudo[86584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcuakwcospwgqmhlinbmznnmihjxprsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224869.7944276-123-92164489697757/AnsiballZ_stat.py'
Jan 12 13:34:29 compute-0 sudo[86584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:30 compute-0 python3.9[86586]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:30 compute-0 sudo[86584]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:30 compute-0 sudo[86662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyzzhowpmfjwggnbxovkwmwejqdsyigp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224869.7944276-123-92164489697757/AnsiballZ_file.py'
Jan 12 13:34:30 compute-0 sudo[86662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:30 compute-0 python3.9[86664]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:30 compute-0 sudo[86662]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:30 compute-0 sudo[86814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sufdmvpaaujyaorgepkyevdswsgttzyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224870.5782328-136-187991905418454/AnsiballZ_command.py'
Jan 12 13:34:30 compute-0 sudo[86814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:31 compute-0 python3.9[86816]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:34:31 compute-0 sudo[86814]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:31 compute-0 sudo[86967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlvwvwfxhqjnhshcgvhhgcwhqbaxfqcj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768224871.1465359-144-253500038805150/AnsiballZ_edpm_nftables_from_files.py'
Jan 12 13:34:31 compute-0 sudo[86967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:31 compute-0 python3[86969]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 12 13:34:31 compute-0 sudo[86967]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:31 compute-0 sudo[87119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seqmzfpqlkvunrjxrlsrolskgbdigwvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224871.7069328-152-16605762086994/AnsiballZ_stat.py'
Jan 12 13:34:31 compute-0 sudo[87119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:32 compute-0 python3.9[87121]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:32 compute-0 sudo[87119]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:32 compute-0 sudo[87244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evbstrdycdafvhddqfcqfswkfinwsziy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224871.7069328-152-16605762086994/AnsiballZ_copy.py'
Jan 12 13:34:32 compute-0 sudo[87244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:32 compute-0 python3.9[87246]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224871.7069328-152-16605762086994/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:32 compute-0 sudo[87244]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:32 compute-0 sudo[87396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byiohvwrjhymkfrtgufemezytoqctyum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224872.6473253-167-140848088382511/AnsiballZ_stat.py'
Jan 12 13:34:32 compute-0 sudo[87396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:32 compute-0 python3.9[87398]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:33 compute-0 sudo[87396]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:33 compute-0 sudo[87521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmkidcbznytjyxadisikuhviwmeghmim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224872.6473253-167-140848088382511/AnsiballZ_copy.py'
Jan 12 13:34:33 compute-0 sudo[87521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:33 compute-0 python3.9[87523]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224872.6473253-167-140848088382511/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:33 compute-0 sudo[87521]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:33 compute-0 sudo[87673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppfqdsmqjkjqvwxhwzhrfajmlilpinmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224873.490164-182-66771350060394/AnsiballZ_stat.py'
Jan 12 13:34:33 compute-0 sudo[87673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:33 compute-0 python3.9[87675]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:33 compute-0 sudo[87673]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:34 compute-0 sudo[87798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltjnunhdslhdhqirxkxggukxrerpmjax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224873.490164-182-66771350060394/AnsiballZ_copy.py'
Jan 12 13:34:34 compute-0 sudo[87798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:34 compute-0 python3.9[87800]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224873.490164-182-66771350060394/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:34 compute-0 sudo[87798]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:34 compute-0 sudo[87950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgupuvkpghdpchvddydyauatedfdvnhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224874.309267-197-59788662302306/AnsiballZ_stat.py'
Jan 12 13:34:34 compute-0 sudo[87950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:34 compute-0 python3.9[87952]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:34 compute-0 sudo[87950]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:34 compute-0 sudo[88075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfrnhcdwsxpdmlphoupmobmwvaxqumvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224874.309267-197-59788662302306/AnsiballZ_copy.py'
Jan 12 13:34:34 compute-0 sudo[88075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:34 compute-0 python3.9[88077]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224874.309267-197-59788662302306/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:35 compute-0 sudo[88075]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:35 compute-0 sudo[88227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tciuigyblwdadfdfkinacqamresvlmja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224875.120019-212-108141474775054/AnsiballZ_stat.py'
Jan 12 13:34:35 compute-0 sudo[88227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:35 compute-0 python3.9[88229]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:35 compute-0 sudo[88227]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:35 compute-0 sudo[88352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbktjzpkedyuqhyxdwfilrnkwnllktmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224875.120019-212-108141474775054/AnsiballZ_copy.py'
Jan 12 13:34:35 compute-0 sudo[88352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:35 compute-0 python3.9[88354]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768224875.120019-212-108141474775054/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:35 compute-0 sudo[88352]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:36 compute-0 sudo[88504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimmeoeobkeaxjhlkyiejekprueyisjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224875.9997108-227-258973626557760/AnsiballZ_file.py'
Jan 12 13:34:36 compute-0 sudo[88504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:36 compute-0 python3.9[88506]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:36 compute-0 sudo[88504]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:36 compute-0 sudo[88656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaqlnzxsvutyndxxzayalsogheiyfrat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224876.449732-235-62980227703651/AnsiballZ_command.py'
Jan 12 13:34:36 compute-0 sudo[88656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:36 compute-0 python3.9[88658]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:34:36 compute-0 sudo[88656]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:37 compute-0 sudo[88811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmnbiuwyvpowkolhsuvmntfbfbfiofql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224876.9318402-243-107021754506988/AnsiballZ_blockinfile.py'
Jan 12 13:34:37 compute-0 sudo[88811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:37 compute-0 python3.9[88813]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:37 compute-0 sudo[88811]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:37 compute-0 sudo[88963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aggupmrtwucdyjmldpyjxlassfdkjkun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224877.53271-252-56032302941395/AnsiballZ_command.py'
Jan 12 13:34:37 compute-0 sudo[88963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:37 compute-0 python3.9[88965]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:34:37 compute-0 sudo[88963]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:38 compute-0 sudo[89116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wocynzvncbmipbdtpsgjabjhcpbmmhrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224878.0039053-260-206985077331944/AnsiballZ_stat.py'
Jan 12 13:34:38 compute-0 sudo[89116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:38 compute-0 python3.9[89118]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:34:38 compute-0 sudo[89116]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:38 compute-0 sudo[89270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imqdvhgmzpekitlwtcqujfzgwvnsviux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224878.4577274-268-79219890476610/AnsiballZ_command.py'
Jan 12 13:34:38 compute-0 sudo[89270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:38 compute-0 python3.9[89272]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:34:38 compute-0 sudo[89270]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:39 compute-0 sudo[89425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxwboxivutcgfvctfqrvhohmbutwjkwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224878.943301-276-203999742888513/AnsiballZ_file.py'
Jan 12 13:34:39 compute-0 sudo[89425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:39 compute-0 python3.9[89427]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:39 compute-0 sudo[89425]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:40 compute-0 python3.9[89577]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:34:40 compute-0 sudo[89728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcfdlamwptnktdsrnyowsxiawfcuantu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224880.5643244-316-177081479037160/AnsiballZ_command.py'
Jan 12 13:34:40 compute-0 sudo[89728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:40 compute-0 python3.9[89730]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:9d:bd:06:c0" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:34:40 compute-0 ovs-vsctl[89731]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:9d:bd:06:c0 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 12 13:34:40 compute-0 sudo[89728]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:41 compute-0 sudo[89881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfeyhocwtbsblbujfzxdsjxkuuiruffc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224881.0388892-325-90832297606800/AnsiballZ_command.py'
Jan 12 13:34:41 compute-0 sudo[89881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:41 compute-0 python3.9[89883]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:34:41 compute-0 sudo[89881]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:41 compute-0 sudo[90036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgmletqlzxkhxqcpgxpkupbgecnrixea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224881.4830186-333-76514422663071/AnsiballZ_command.py'
Jan 12 13:34:41 compute-0 sudo[90036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:41 compute-0 python3.9[90038]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:34:41 compute-0 ovs-vsctl[90039]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 12 13:34:41 compute-0 sudo[90036]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:42 compute-0 python3.9[90189]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:34:42 compute-0 sudo[90341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczlvmdvxwxkhrqfjqidtmryhrmfomlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224882.4070888-350-236214236554698/AnsiballZ_file.py'
Jan 12 13:34:42 compute-0 sudo[90341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:42 compute-0 python3.9[90343]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:42 compute-0 sudo[90341]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:43 compute-0 sudo[90493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbvuskbnhbdfrlildddhodeyzmngyrwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224882.8557167-358-150424081542004/AnsiballZ_stat.py'
Jan 12 13:34:43 compute-0 sudo[90493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:43 compute-0 python3.9[90495]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:43 compute-0 sudo[90493]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:43 compute-0 sudo[90571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfbedpwoiyhupsxuotwpunckyoibhjfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224882.8557167-358-150424081542004/AnsiballZ_file.py'
Jan 12 13:34:43 compute-0 sudo[90571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:43 compute-0 python3.9[90573]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:43 compute-0 sudo[90571]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:43 compute-0 sudo[90723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayutpbnyjznfqvwzxidslwxaaqolfapw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224883.5679784-358-144236390021647/AnsiballZ_stat.py'
Jan 12 13:34:43 compute-0 sudo[90723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:43 compute-0 chronyd[64547]: Selected source 5.161.111.190 (pool.ntp.org)
Jan 12 13:34:43 compute-0 python3.9[90725]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:43 compute-0 sudo[90723]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:44 compute-0 sudo[90801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zronkcggqyzlkixjiaukdunamgdndbci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224883.5679784-358-144236390021647/AnsiballZ_file.py'
Jan 12 13:34:44 compute-0 sudo[90801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:44 compute-0 python3.9[90803]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:44 compute-0 sudo[90801]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:44 compute-0 sudo[90953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urnypxzpvnyunmtaihedoaaoukeuzulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224884.3289254-381-108974538647041/AnsiballZ_file.py'
Jan 12 13:34:44 compute-0 sudo[90953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:44 compute-0 python3.9[90955]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:44 compute-0 sudo[90953]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:44 compute-0 sudo[91105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awmxiqufwmkfdywpstgvdddacdpyjeff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224884.7656438-389-70136151628822/AnsiballZ_stat.py'
Jan 12 13:34:44 compute-0 sudo[91105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:45 compute-0 python3.9[91107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:45 compute-0 sudo[91105]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:45 compute-0 sudo[91183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyfkbzbzcdbcdyrqmqiucpvzedyluizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224884.7656438-389-70136151628822/AnsiballZ_file.py'
Jan 12 13:34:45 compute-0 sudo[91183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:45 compute-0 python3.9[91185]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:45 compute-0 sudo[91183]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:45 compute-0 sudo[91335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snqfkbmmhuorqmmxwgunvycmirmqwexl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224885.5230315-401-100160360873309/AnsiballZ_stat.py'
Jan 12 13:34:45 compute-0 sudo[91335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:45 compute-0 python3.9[91337]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:45 compute-0 sudo[91335]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:45 compute-0 sudo[91413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hywplvehfiepqbhygnczbdrjkkjfoavf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224885.5230315-401-100160360873309/AnsiballZ_file.py'
Jan 12 13:34:45 compute-0 sudo[91413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:46 compute-0 python3.9[91415]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:46 compute-0 sudo[91413]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:46 compute-0 sudo[91565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezauvihdtbblrcgwgvsjjjqhvnubiohv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224886.2519436-413-225976104032088/AnsiballZ_systemd.py'
Jan 12 13:34:46 compute-0 sudo[91565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:46 compute-0 python3.9[91567]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:34:46 compute-0 systemd[1]: Reloading.
Jan 12 13:34:46 compute-0 systemd-rc-local-generator[91591]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:34:46 compute-0 systemd-sysv-generator[91594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:34:46 compute-0 sudo[91565]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:47 compute-0 sudo[91755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jobvgpnskmzbggngyeoajzetdhvrsbap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224886.9644597-421-233930672472722/AnsiballZ_stat.py'
Jan 12 13:34:47 compute-0 sudo[91755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:47 compute-0 python3.9[91757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:47 compute-0 sudo[91755]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:47 compute-0 sudo[91833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzmvqfrhpbkvnvtghufdsmmsnrakrkzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224886.9644597-421-233930672472722/AnsiballZ_file.py'
Jan 12 13:34:47 compute-0 sudo[91833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:47 compute-0 python3.9[91835]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:47 compute-0 sudo[91833]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:47 compute-0 sudo[91985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrosoemauzqfpttxvoehfhsdfcfdrooo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224887.7704134-433-56545032984606/AnsiballZ_stat.py'
Jan 12 13:34:47 compute-0 sudo[91985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:48 compute-0 python3.9[91987]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:48 compute-0 sudo[91985]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:48 compute-0 sudo[92063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dslhfqwhwcgrinpjeqtrmnjtnvyguhpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224887.7704134-433-56545032984606/AnsiballZ_file.py'
Jan 12 13:34:48 compute-0 sudo[92063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:48 compute-0 python3.9[92065]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:48 compute-0 sudo[92063]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:48 compute-0 sudo[92215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gytslnznnghyedslkomkhirxsaulkjnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224888.5241878-445-175204952384194/AnsiballZ_systemd.py'
Jan 12 13:34:48 compute-0 sudo[92215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:48 compute-0 python3.9[92217]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:34:48 compute-0 systemd[1]: Reloading.
Jan 12 13:34:49 compute-0 systemd-rc-local-generator[92241]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:34:49 compute-0 systemd-sysv-generator[92244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:34:49 compute-0 systemd[1]: Starting Create netns directory...
Jan 12 13:34:49 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 12 13:34:49 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 12 13:34:49 compute-0 systemd[1]: Finished Create netns directory.
Jan 12 13:34:49 compute-0 sudo[92215]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:49 compute-0 sudo[92409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smvpzxcqrmezgylbmpsgudjkveckjgep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224889.371042-455-258100406936386/AnsiballZ_file.py'
Jan 12 13:34:49 compute-0 sudo[92409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:49 compute-0 python3.9[92411]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:49 compute-0 sudo[92409]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:49 compute-0 sudo[92561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rccjfnlzvhlfbborbnwwlclqbnibpzcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224889.8341758-463-57973162618132/AnsiballZ_stat.py'
Jan 12 13:34:49 compute-0 sudo[92561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:50 compute-0 python3.9[92563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:50 compute-0 sudo[92561]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:50 compute-0 sudo[92684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azdrezukwrmflfidiisoqqqzisncmolv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224889.8341758-463-57973162618132/AnsiballZ_copy.py'
Jan 12 13:34:50 compute-0 sudo[92684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:50 compute-0 python3.9[92686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224889.8341758-463-57973162618132/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:50 compute-0 sudo[92684]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:51 compute-0 sudo[92836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdjsyfqxriqukdodkhujavdcrkatouvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224890.8130949-480-39823546934930/AnsiballZ_file.py'
Jan 12 13:34:51 compute-0 sudo[92836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:51 compute-0 python3.9[92838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:51 compute-0 sudo[92836]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:51 compute-0 sudo[92988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsvagdtxurwqbroczzigxbpwqigetdkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224891.3417046-488-180091779226209/AnsiballZ_file.py'
Jan 12 13:34:51 compute-0 sudo[92988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:51 compute-0 python3.9[92990]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:34:51 compute-0 sudo[92988]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:52 compute-0 sudo[93140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eowihwhntfvczeskoxmlfupsbnhucyhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224891.8397207-496-111330105096035/AnsiballZ_stat.py'
Jan 12 13:34:52 compute-0 sudo[93140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:52 compute-0 python3.9[93142]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:34:52 compute-0 sudo[93140]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:52 compute-0 sudo[93263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pornmhriaslzkwwviuifaklvonbiiiyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224891.8397207-496-111330105096035/AnsiballZ_copy.py'
Jan 12 13:34:52 compute-0 sudo[93263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:52 compute-0 python3.9[93265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224891.8397207-496-111330105096035/.source.json _original_basename=.96h25eg1 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:52 compute-0 sudo[93263]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:52 compute-0 python3.9[93415]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:54 compute-0 sudo[93836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atdclmxmcliizdxgngjnxonyjoiloblp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224894.0268896-536-142598312343616/AnsiballZ_container_config_data.py'
Jan 12 13:34:54 compute-0 sudo[93836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:54 compute-0 python3.9[93838]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 12 13:34:54 compute-0 sudo[93836]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:55 compute-0 sudo[93988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojszrmivilicvdigdsqiedvolshgfgqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224894.7307703-547-253783689230483/AnsiballZ_container_config_hash.py'
Jan 12 13:34:55 compute-0 sudo[93988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:55 compute-0 python3.9[93990]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:34:55 compute-0 sudo[93988]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:55 compute-0 sudo[94140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffbxmtbdufjmlkffxjyfvhqqgsujemdz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768224895.4168394-557-278310393080881/AnsiballZ_edpm_container_manage.py'
Jan 12 13:34:55 compute-0 sudo[94140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:55 compute-0 python3[94142]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:34:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:34:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:34:56 compute-0 podman[94171]: 2026-01-12 13:34:56.087439449 +0000 UTC m=+0.026758774 container create 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 12 13:34:56 compute-0 podman[94171]: 2026-01-12 13:34:56.07552387 +0000 UTC m=+0.014843204 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 12 13:34:56 compute-0 python3[94142]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 12 13:34:56 compute-0 sudo[94140]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:56 compute-0 sudo[94349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjbgrabmjgyrxwhzqokyilkeyatbyuza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224896.2861276-565-122133151430188/AnsiballZ_stat.py'
Jan 12 13:34:56 compute-0 sudo[94349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:56 compute-0 python3.9[94351]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:34:56 compute-0 sudo[94349]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:56 compute-0 sudo[94503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwlygwnyspnhbgdczwfwuyhrhzjgzwsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224896.8068442-574-99796510797493/AnsiballZ_file.py'
Jan 12 13:34:56 compute-0 sudo[94503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 12 13:34:57 compute-0 python3.9[94505]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:57 compute-0 sudo[94503]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:57 compute-0 sudo[94579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfemawwpuswsdfnwcxtnzdhqhpfaiwhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224896.8068442-574-99796510797493/AnsiballZ_stat.py'
Jan 12 13:34:57 compute-0 sudo[94579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:57 compute-0 python3.9[94581]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:34:57 compute-0 sudo[94579]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:57 compute-0 sudo[94730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeeropigpfbbnfbiwcgevmsixxsbungm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224897.4797184-574-88237930234875/AnsiballZ_copy.py'
Jan 12 13:34:57 compute-0 sudo[94730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:57 compute-0 python3.9[94732]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768224897.4797184-574-88237930234875/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:34:57 compute-0 sudo[94730]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:58 compute-0 sudo[94806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjqadgzmuaclibxsrfptlvdvvhtoglqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224897.4797184-574-88237930234875/AnsiballZ_systemd.py'
Jan 12 13:34:58 compute-0 sudo[94806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:58 compute-0 python3.9[94808]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:34:58 compute-0 systemd[1]: Reloading.
Jan 12 13:34:58 compute-0 systemd-rc-local-generator[94830]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:34:58 compute-0 systemd-sysv-generator[94833]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:34:58 compute-0 sudo[94806]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:58 compute-0 sudo[94918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujgvlkbfystwrsxlatfpfwvbvqppfofj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224897.4797184-574-88237930234875/AnsiballZ_systemd.py'
Jan 12 13:34:58 compute-0 sudo[94918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:34:58 compute-0 python3.9[94920]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:34:58 compute-0 systemd[1]: Reloading.
Jan 12 13:34:59 compute-0 systemd-rc-local-generator[94943]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:34:59 compute-0 systemd-sysv-generator[94946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:34:59 compute-0 systemd[1]: Starting ovn_controller container...
Jan 12 13:34:59 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 12 13:34:59 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:34:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/add66e3d65e04e677f096af6912c7b277ba0cfe4ab2cfa9eb9011ff7da8e9a7b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 12 13:34:59 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c.
Jan 12 13:34:59 compute-0 podman[94961]: 2026-01-12 13:34:59.243186706 +0000 UTC m=+0.085591202 container init 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + sudo -E kolla_set_configs
Jan 12 13:34:59 compute-0 podman[94961]: 2026-01-12 13:34:59.265732249 +0000 UTC m=+0.108136726 container start 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 12 13:34:59 compute-0 edpm-start-podman-container[94961]: ovn_controller
Jan 12 13:34:59 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 12 13:34:59 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 12 13:34:59 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 12 13:34:59 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 12 13:34:59 compute-0 systemd[95004]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 12 13:34:59 compute-0 edpm-start-podman-container[94960]: Creating additional drop-in dependency for "ovn_controller" (317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c)
Jan 12 13:34:59 compute-0 podman[94981]: 2026-01-12 13:34:59.331651208 +0000 UTC m=+0.056547153 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 12 13:34:59 compute-0 systemd[1]: Reloading.
Jan 12 13:34:59 compute-0 systemd-rc-local-generator[95049]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:34:59 compute-0 systemd[95004]: Queued start job for default target Main User Target.
Jan 12 13:34:59 compute-0 systemd[95004]: Created slice User Application Slice.
Jan 12 13:34:59 compute-0 systemd[95004]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 12 13:34:59 compute-0 systemd[95004]: Started Daily Cleanup of User's Temporary Directories.
Jan 12 13:34:59 compute-0 systemd[95004]: Reached target Paths.
Jan 12 13:34:59 compute-0 systemd[95004]: Reached target Timers.
Jan 12 13:34:59 compute-0 systemd[95004]: Starting D-Bus User Message Bus Socket...
Jan 12 13:34:59 compute-0 systemd-sysv-generator[95052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:34:59 compute-0 systemd[95004]: Starting Create User's Volatile Files and Directories...
Jan 12 13:34:59 compute-0 systemd[95004]: Finished Create User's Volatile Files and Directories.
Jan 12 13:34:59 compute-0 systemd[95004]: Listening on D-Bus User Message Bus Socket.
Jan 12 13:34:59 compute-0 systemd[95004]: Reached target Sockets.
Jan 12 13:34:59 compute-0 systemd[95004]: Reached target Basic System.
Jan 12 13:34:59 compute-0 systemd[95004]: Reached target Main User Target.
Jan 12 13:34:59 compute-0 systemd[95004]: Startup finished in 95ms.
Jan 12 13:34:59 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 12 13:34:59 compute-0 systemd[1]: Started ovn_controller container.
Jan 12 13:34:59 compute-0 systemd[1]: 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c-5329e144af5ca291.service: Main process exited, code=exited, status=1/FAILURE
Jan 12 13:34:59 compute-0 systemd[1]: 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c-5329e144af5ca291.service: Failed with result 'exit-code'.
Jan 12 13:34:59 compute-0 systemd[1]: Started Session c1 of User root.
Jan 12 13:34:59 compute-0 sudo[94918]: pam_unix(sudo:session): session closed for user root
Jan 12 13:34:59 compute-0 ovn_controller[94974]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 12 13:34:59 compute-0 ovn_controller[94974]: INFO:__main__:Validating config file
Jan 12 13:34:59 compute-0 ovn_controller[94974]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 12 13:34:59 compute-0 ovn_controller[94974]: INFO:__main__:Writing out command to execute
Jan 12 13:34:59 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: ++ cat /run_command
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + ARGS=
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + sudo kolla_copy_cacerts
Jan 12 13:34:59 compute-0 systemd[1]: Started Session c2 of User root.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + [[ ! -n '' ]]
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + . kolla_extend_start
Jan 12 13:34:59 compute-0 ovn_controller[94974]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + umask 0022
Jan 12 13:34:59 compute-0 ovn_controller[94974]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 12 13:34:59 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6421] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6425] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <warn>  [1768224899.6426] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6431] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6435] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6437] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 12 13:34:59 compute-0 kernel: br-int: entered promiscuous mode
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00024|main|INFO|OVS feature set changed, force recompute.
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 12 13:34:59 compute-0 ovn_controller[94974]: 2026-01-12T13:34:59Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6565] manager: (ovn-212439-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 12 13:34:59 compute-0 systemd-udevd[95104]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:34:59 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 12 13:34:59 compute-0 systemd-udevd[95106]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6680] device (genev_sys_6081): carrier: link connected
Jan 12 13:34:59 compute-0 NetworkManager[55211]: <info>  [1768224899.6682] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 12 13:35:00 compute-0 python3.9[95234]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 12 13:35:00 compute-0 sudo[95384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yikghxvuwxozvdaoxwhhtphtkqtbwumr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224900.5146704-619-150083060232568/AnsiballZ_stat.py'
Jan 12 13:35:00 compute-0 sudo[95384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:00 compute-0 python3.9[95386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:00 compute-0 sudo[95384]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:01 compute-0 sudo[95507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvagzivpkvkrqmvsgqcntljmsqksjxnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224900.5146704-619-150083060232568/AnsiballZ_copy.py'
Jan 12 13:35:01 compute-0 sudo[95507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:01 compute-0 python3.9[95509]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224900.5146704-619-150083060232568/.source.yaml _original_basename=.gi1z6trl follow=False checksum=b63b068aa43faeb041156366f5365e2ac46469f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:01 compute-0 sudo[95507]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:01 compute-0 sudo[95659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhzkjbfrjcpclahmfqwhspovifvtdaye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224901.344515-634-184136727016048/AnsiballZ_command.py'
Jan 12 13:35:01 compute-0 sudo[95659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:01 compute-0 python3.9[95661]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:35:01 compute-0 ovs-vsctl[95662]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 12 13:35:01 compute-0 sudo[95659]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:01 compute-0 sudo[95812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uosghuivdbvmjellrgpirndodrduhrpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224901.8193197-642-132121751090503/AnsiballZ_command.py'
Jan 12 13:35:01 compute-0 sudo[95812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:02 compute-0 python3.9[95814]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:35:02 compute-0 ovs-vsctl[95816]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 12 13:35:02 compute-0 sudo[95812]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:02 compute-0 sudo[95967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkqbmbkiiwpvjgclocvnmnhmhobdkrip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224902.4475322-656-84068266117280/AnsiballZ_command.py'
Jan 12 13:35:02 compute-0 sudo[95967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:02 compute-0 python3.9[95969]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:35:02 compute-0 ovs-vsctl[95970]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 12 13:35:02 compute-0 sudo[95967]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:03 compute-0 sshd-session[84514]: Connection closed by 192.168.122.30 port 33842
Jan 12 13:35:03 compute-0 sshd-session[84511]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:35:03 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 12 13:35:03 compute-0 systemd[1]: session-19.scope: Consumed 31.379s CPU time.
Jan 12 13:35:03 compute-0 systemd-logind[775]: Session 19 logged out. Waiting for processes to exit.
Jan 12 13:35:03 compute-0 systemd-logind[775]: Removed session 19.
Jan 12 13:35:08 compute-0 sshd-session[95995]: Accepted publickey for zuul from 192.168.122.30 port 59752 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:35:08 compute-0 systemd-logind[775]: New session 21 of user zuul.
Jan 12 13:35:08 compute-0 systemd[1]: Started Session 21 of User zuul.
Jan 12 13:35:08 compute-0 sshd-session[95995]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:35:09 compute-0 python3.9[96148]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:35:09 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 12 13:35:09 compute-0 systemd[95004]: Activating special unit Exit the Session...
Jan 12 13:35:09 compute-0 systemd[95004]: Stopped target Main User Target.
Jan 12 13:35:09 compute-0 systemd[95004]: Stopped target Basic System.
Jan 12 13:35:09 compute-0 systemd[95004]: Stopped target Paths.
Jan 12 13:35:09 compute-0 systemd[95004]: Stopped target Sockets.
Jan 12 13:35:09 compute-0 systemd[95004]: Stopped target Timers.
Jan 12 13:35:09 compute-0 systemd[95004]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 12 13:35:09 compute-0 systemd[95004]: Closed D-Bus User Message Bus Socket.
Jan 12 13:35:09 compute-0 systemd[95004]: Stopped Create User's Volatile Files and Directories.
Jan 12 13:35:09 compute-0 systemd[95004]: Removed slice User Application Slice.
Jan 12 13:35:09 compute-0 systemd[95004]: Reached target Shutdown.
Jan 12 13:35:09 compute-0 systemd[95004]: Finished Exit the Session.
Jan 12 13:35:09 compute-0 systemd[95004]: Reached target Exit the Session.
Jan 12 13:35:09 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 12 13:35:09 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 12 13:35:09 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 12 13:35:09 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 12 13:35:09 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 12 13:35:09 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 12 13:35:09 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 12 13:35:09 compute-0 sudo[96303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sirgutrngdgycrnzbznubfbrijwwkwhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224909.4244845-29-205457145771542/AnsiballZ_file.py'
Jan 12 13:35:09 compute-0 sudo[96303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:09 compute-0 python3.9[96305]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:09 compute-0 sudo[96303]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:10 compute-0 sudo[96455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usqztlpuwtqlffehofmjtmdakrudvavc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224909.9879158-29-261023406103560/AnsiballZ_file.py'
Jan 12 13:35:10 compute-0 sudo[96455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:10 compute-0 python3.9[96457]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:10 compute-0 sudo[96455]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:10 compute-0 sudo[96607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiecdkykqsufowbejzyavmjiimcayofh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224910.4191995-29-203330038255658/AnsiballZ_file.py'
Jan 12 13:35:10 compute-0 sudo[96607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:10 compute-0 python3.9[96609]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:10 compute-0 sudo[96607]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:11 compute-0 sudo[96759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqzqiaxhcgoloemxpnzabwtsqehdeyyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224910.8557477-29-269950200579272/AnsiballZ_file.py'
Jan 12 13:35:11 compute-0 sudo[96759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:11 compute-0 python3.9[96761]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:11 compute-0 sudo[96759]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:11 compute-0 sudo[96911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkpfkcnsydulbvwvccqtkxxlubvfkrmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224911.279061-29-184663913961127/AnsiballZ_file.py'
Jan 12 13:35:11 compute-0 sudo[96911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:11 compute-0 python3.9[96913]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:11 compute-0 sudo[96911]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:12 compute-0 python3.9[97063]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:35:12 compute-0 sudo[97213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeafspossflofrykmghekwrfxamdshdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224912.247561-73-15238484685518/AnsiballZ_seboolean.py'
Jan 12 13:35:12 compute-0 sudo[97213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:12 compute-0 python3.9[97215]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 12 13:35:13 compute-0 sudo[97213]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:13 compute-0 python3.9[97365]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:14 compute-0 python3.9[97486]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224913.307201-81-112499721848610/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:14 compute-0 python3.9[97636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:15 compute-0 python3.9[97757]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224914.3856711-96-179691112929173/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:15 compute-0 sudo[97907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijpunfnxyebehyqpyolurzoaormzzawp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224915.2528799-113-184281474941695/AnsiballZ_setup.py'
Jan 12 13:35:15 compute-0 sudo[97907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:15 compute-0 python3.9[97909]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:35:15 compute-0 sudo[97907]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:16 compute-0 sudo[97991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybzlvvsfpescoorbxgyotseysqzbfhlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224915.2528799-113-184281474941695/AnsiballZ_dnf.py'
Jan 12 13:35:16 compute-0 sudo[97991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:16 compute-0 python3.9[97993]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:35:17 compute-0 sudo[97991]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:17 compute-0 sudo[98145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzjtnhvhtkqkhbklckxlewuwjlieoehw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224917.365536-125-269053820927403/AnsiballZ_systemd.py'
Jan 12 13:35:17 compute-0 sudo[98145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:18 compute-0 python3.9[98147]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:35:18 compute-0 sudo[98145]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:18 compute-0 python3.9[98300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:18 compute-0 python3.9[98421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224918.1823208-133-220269958585835/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:19 compute-0 python3.9[98571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:19 compute-0 python3.9[98692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224918.9417553-133-17212961552982/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:20 compute-0 python3.9[98842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:20 compute-0 python3.9[98963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224920.1169484-177-54618035520256/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:21 compute-0 python3.9[99113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:21 compute-0 python3.9[99234]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224920.8542428-177-234155858653071/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:21 compute-0 python3.9[99384]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:35:22 compute-0 sudo[99536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgxykcsccwlhwzjgcxreopggbwxmlcsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224922.0435834-215-224274022864495/AnsiballZ_file.py'
Jan 12 13:35:22 compute-0 sudo[99536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:22 compute-0 python3.9[99538]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:22 compute-0 sudo[99536]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:22 compute-0 sudo[99688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jveedwiwioubvimiclrwkdfvjokaiyfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224922.4699006-223-61821032148363/AnsiballZ_stat.py'
Jan 12 13:35:22 compute-0 sudo[99688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:22 compute-0 python3.9[99690]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:22 compute-0 sudo[99688]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:22 compute-0 sudo[99766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aivueilpvredckkvnpcohzurjuyhqgos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224922.4699006-223-61821032148363/AnsiballZ_file.py'
Jan 12 13:35:22 compute-0 sudo[99766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:23 compute-0 python3.9[99768]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:23 compute-0 sudo[99766]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:23 compute-0 sudo[99918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhlguveekhjixfxnouksuusyuxhmluo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224923.1859004-223-37347002474005/AnsiballZ_stat.py'
Jan 12 13:35:23 compute-0 sudo[99918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:23 compute-0 python3.9[99920]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:23 compute-0 sudo[99918]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:23 compute-0 sudo[99996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdsyevfkfueygtnscidaogbqjkaiucff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224923.1859004-223-37347002474005/AnsiballZ_file.py'
Jan 12 13:35:23 compute-0 sudo[99996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:23 compute-0 python3.9[99998]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:23 compute-0 sudo[99996]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:24 compute-0 sudo[100148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyhjdwqcnojoataonkkcdedhrvwqfmeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224923.9208267-246-127311929431836/AnsiballZ_file.py'
Jan 12 13:35:24 compute-0 sudo[100148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:24 compute-0 python3.9[100150]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:24 compute-0 sudo[100148]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:24 compute-0 sudo[100300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svdilgtthpjpknvmoddhjfsbmoqpcdeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224924.3650887-254-145948870203581/AnsiballZ_stat.py'
Jan 12 13:35:24 compute-0 sudo[100300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:24 compute-0 python3.9[100302]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:24 compute-0 sudo[100300]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:24 compute-0 sudo[100378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvevmqfmzekvgswhbyqsvqpnfryntuus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224924.3650887-254-145948870203581/AnsiballZ_file.py'
Jan 12 13:35:24 compute-0 sudo[100378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:25 compute-0 python3.9[100380]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:25 compute-0 sudo[100378]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:25 compute-0 sudo[100530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwqnproeqzscjvatazmddtcahylovqvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224925.1210039-266-93439748999826/AnsiballZ_stat.py'
Jan 12 13:35:25 compute-0 sudo[100530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:25 compute-0 python3.9[100532]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:25 compute-0 sudo[100530]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:25 compute-0 sudo[100608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wolbwbrajjetdsxqqvxodwjcprpwaznw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224925.1210039-266-93439748999826/AnsiballZ_file.py'
Jan 12 13:35:25 compute-0 sudo[100608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:25 compute-0 python3.9[100610]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:25 compute-0 sudo[100608]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:26 compute-0 sudo[100760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekrgcusckcznrykymejczjjwvivsyafb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224925.8773277-278-243497866595857/AnsiballZ_systemd.py'
Jan 12 13:35:26 compute-0 sudo[100760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:26 compute-0 python3.9[100762]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:26 compute-0 systemd[1]: Reloading.
Jan 12 13:35:26 compute-0 systemd-rc-local-generator[100782]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:35:26 compute-0 systemd-sysv-generator[100786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:35:26 compute-0 sudo[100760]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:26 compute-0 sudo[100949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gagrjrkhiegvdlbcvigiutvnjzpkhxbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224926.6014428-286-245763672018265/AnsiballZ_stat.py'
Jan 12 13:35:26 compute-0 sudo[100949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:26 compute-0 python3.9[100951]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:26 compute-0 sudo[100949]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:27 compute-0 sudo[101027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnbzrbzvccazqauravvkmrewdzevdrva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224926.6014428-286-245763672018265/AnsiballZ_file.py'
Jan 12 13:35:27 compute-0 sudo[101027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:27 compute-0 python3.9[101029]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:27 compute-0 sudo[101027]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:27 compute-0 sudo[101179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svlhwmgmdiucjumaxxdxyiirgzbnzwfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224927.362614-298-250697813928880/AnsiballZ_stat.py'
Jan 12 13:35:27 compute-0 sudo[101179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:27 compute-0 python3.9[101181]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:27 compute-0 sudo[101179]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:27 compute-0 sudo[101257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blntsixkmswcqwrbvnyigeknxpuuzvac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224927.362614-298-250697813928880/AnsiballZ_file.py'
Jan 12 13:35:27 compute-0 sudo[101257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:27 compute-0 python3.9[101259]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:27 compute-0 sudo[101257]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:28 compute-0 sudo[101409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaufqppiknahlftsspmyqkybkierosyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224928.09814-310-149721030037618/AnsiballZ_systemd.py'
Jan 12 13:35:28 compute-0 sudo[101409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:28 compute-0 python3.9[101411]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:28 compute-0 systemd[1]: Reloading.
Jan 12 13:35:28 compute-0 systemd-rc-local-generator[101436]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:35:28 compute-0 systemd-sysv-generator[101439]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:35:28 compute-0 systemd[1]: Starting Create netns directory...
Jan 12 13:35:28 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 12 13:35:28 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 12 13:35:28 compute-0 systemd[1]: Finished Create netns directory.
Jan 12 13:35:28 compute-0 sudo[101409]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:29 compute-0 sudo[101602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvsuunisabkecsztwmddxwjrxizlrgop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224928.912991-320-47439931911109/AnsiballZ_file.py'
Jan 12 13:35:29 compute-0 sudo[101602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:29 compute-0 python3.9[101604]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:29 compute-0 sudo[101602]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:29 compute-0 sudo[101754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brbhufloatzhdqzuwbdfohwwocizfkul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224929.3510094-328-145057505898132/AnsiballZ_stat.py'
Jan 12 13:35:29 compute-0 sudo[101754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:29 compute-0 python3.9[101756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:29 compute-0 sudo[101754]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:29 compute-0 sudo[101884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqxtlsrcavulptxnklzxhnwtmfqpygon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224929.3510094-328-145057505898132/AnsiballZ_copy.py'
Jan 12 13:35:29 compute-0 sudo[101884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:29 compute-0 ovn_controller[94974]: 2026-01-12T13:35:29Z|00025|memory|INFO|16384 kB peak resident set size after 30.3 seconds
Jan 12 13:35:29 compute-0 ovn_controller[94974]: 2026-01-12T13:35:29Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Jan 12 13:35:29 compute-0 podman[101851]: 2026-01-12 13:35:29.954693673 +0000 UTC m=+0.077570715 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:35:30 compute-0 python3.9[101888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768224929.3510094-328-145057505898132/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:30 compute-0 sudo[101884]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:30 compute-0 sudo[102052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxgikyznkzgysrbhcpekzmnmvwsyhzvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224930.3229454-345-204046061803732/AnsiballZ_file.py'
Jan 12 13:35:30 compute-0 sudo[102052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:30 compute-0 python3.9[102054]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:30 compute-0 sudo[102052]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:30 compute-0 sudo[102204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzbmcoglgjuephllfrtbjyxxxhosevjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224930.791252-353-10115135498435/AnsiballZ_file.py'
Jan 12 13:35:30 compute-0 sudo[102204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:31 compute-0 python3.9[102206]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:35:31 compute-0 sudo[102204]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:31 compute-0 sudo[102356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzxrbcncwrsxpovlhokmzuyiolgklsal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224931.2584364-361-164188567320111/AnsiballZ_stat.py'
Jan 12 13:35:31 compute-0 sudo[102356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:31 compute-0 python3.9[102358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:31 compute-0 sudo[102356]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:31 compute-0 sudo[102479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwcibbclduiostuvdoqulaemhixzszdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224931.2584364-361-164188567320111/AnsiballZ_copy.py'
Jan 12 13:35:31 compute-0 sudo[102479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:31 compute-0 python3.9[102481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224931.2584364-361-164188567320111/.source.json _original_basename=.i8sz4aoe follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:31 compute-0 sudo[102479]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:32 compute-0 python3.9[102631]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:33 compute-0 sudo[103052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdupbjfqamrbpauryjccdwlqorbzsedh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224933.4686148-401-6137569802475/AnsiballZ_container_config_data.py'
Jan 12 13:35:33 compute-0 sudo[103052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:33 compute-0 python3.9[103054]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 12 13:35:33 compute-0 sudo[103052]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:34 compute-0 sudo[103204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yislqntevzbxwzbunenhgeruntlrxkxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224934.14962-412-23067527980186/AnsiballZ_container_config_hash.py'
Jan 12 13:35:34 compute-0 sudo[103204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:34 compute-0 python3.9[103206]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:35:34 compute-0 sudo[103204]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:35 compute-0 sudo[103356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggmykvxdylddatdymtelvylcdfxrhpgj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768224934.827904-422-252219796015579/AnsiballZ_edpm_container_manage.py'
Jan 12 13:35:35 compute-0 sudo[103356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:35 compute-0 python3[103358]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:35:35 compute-0 podman[103385]: 2026-01-12 13:35:35.493015325 +0000 UTC m=+0.026821732 container create 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:35:35 compute-0 podman[103385]: 2026-01-12 13:35:35.480548549 +0000 UTC m=+0.014354976 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:35:35 compute-0 python3[103358]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:35:35 compute-0 sudo[103356]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:35 compute-0 sudo[103562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxuwkmjmbsbqhaljbaynbgluxphjrong ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224935.6851184-430-201350501496341/AnsiballZ_stat.py'
Jan 12 13:35:35 compute-0 sudo[103562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:36 compute-0 python3.9[103564]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:35:36 compute-0 sudo[103562]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:36 compute-0 sudo[103716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpbaulimeniusjkvdcrimpwjjockeiqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224936.1909614-439-198602635088474/AnsiballZ_file.py'
Jan 12 13:35:36 compute-0 sudo[103716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:36 compute-0 python3.9[103718]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:36 compute-0 sudo[103716]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:36 compute-0 sudo[103792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykulmeqdjraykyekvemwocxtmsjmitwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224936.1909614-439-198602635088474/AnsiballZ_stat.py'
Jan 12 13:35:36 compute-0 sudo[103792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:36 compute-0 python3.9[103794]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:35:36 compute-0 sudo[103792]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:37 compute-0 sudo[103943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpqhbzjrlpdvsdpegglnlcxtiekgjafz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224936.859879-439-126836566107541/AnsiballZ_copy.py'
Jan 12 13:35:37 compute-0 sudo[103943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:37 compute-0 python3.9[103945]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768224936.859879-439-126836566107541/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:37 compute-0 sudo[103943]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:37 compute-0 sudo[104019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjmftshsavxogwbgsbrfbmnetgveakqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224936.859879-439-126836566107541/AnsiballZ_systemd.py'
Jan 12 13:35:37 compute-0 sudo[104019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:37 compute-0 python3.9[104021]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:35:37 compute-0 systemd[1]: Reloading.
Jan 12 13:35:37 compute-0 systemd-rc-local-generator[104042]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:35:37 compute-0 systemd-sysv-generator[104045]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:35:37 compute-0 sudo[104019]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:38 compute-0 sudo[104129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muuxrvuwvesqldfvsnmngozjccrfywhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224936.859879-439-126836566107541/AnsiballZ_systemd.py'
Jan 12 13:35:38 compute-0 sudo[104129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:38 compute-0 python3.9[104131]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:38 compute-0 systemd[1]: Reloading.
Jan 12 13:35:38 compute-0 systemd-sysv-generator[104160]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:35:38 compute-0 systemd-rc-local-generator[104157]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:35:38 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 12 13:35:38 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:35:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/557abbeabff5e8eeee7b33f3e16b05bf86545aaaf41a060c49be64b89b9b04cc/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 12 13:35:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/557abbeabff5e8eeee7b33f3e16b05bf86545aaaf41a060c49be64b89b9b04cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:35:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184.
Jan 12 13:35:38 compute-0 podman[104172]: 2026-01-12 13:35:38.591658344 +0000 UTC m=+0.074444589 container init 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + sudo -E kolla_set_configs
Jan 12 13:35:38 compute-0 podman[104172]: 2026-01-12 13:35:38.610311478 +0000 UTC m=+0.093097725 container start 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:35:38 compute-0 edpm-start-podman-container[104172]: ovn_metadata_agent
Jan 12 13:35:38 compute-0 edpm-start-podman-container[104171]: Creating additional drop-in dependency for "ovn_metadata_agent" (58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184)
Jan 12 13:35:38 compute-0 podman[104190]: 2026-01-12 13:35:38.664468627 +0000 UTC m=+0.045795782 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Validating config file
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Copying service configuration files
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Writing out command to execute
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: ++ cat /run_command
Jan 12 13:35:38 compute-0 systemd[1]: Reloading.
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + CMD=neutron-ovn-metadata-agent
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + ARGS=
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + sudo kolla_copy_cacerts
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + [[ ! -n '' ]]
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + . kolla_extend_start
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: Running command: 'neutron-ovn-metadata-agent'
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + umask 0022
Jan 12 13:35:38 compute-0 ovn_metadata_agent[104184]: + exec neutron-ovn-metadata-agent
Jan 12 13:35:38 compute-0 systemd-rc-local-generator[104248]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:35:38 compute-0 systemd-sysv-generator[104252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:35:38 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 12 13:35:38 compute-0 sudo[104129]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:39 compute-0 python3.9[104416]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 12 13:35:39 compute-0 sudo[104566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycfzqpxnzfaizafznxkzbpvycslbmxud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224939.7718768-484-51413843641303/AnsiballZ_stat.py'
Jan 12 13:35:39 compute-0 sudo[104566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:40 compute-0 python3.9[104568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:35:40 compute-0 sudo[104566]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.147 104189 INFO neutron.common.config [-] Logging enabled!
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.147 104189 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.148 104189 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.148 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.148 104189 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.148 104189 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.148 104189 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.148 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.149 104189 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.150 104189 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.151 104189 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.152 104189 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.153 104189 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.154 104189 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.155 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.156 104189 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.157 104189 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.158 104189 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.159 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.160 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.161 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.162 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.163 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.164 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.165 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.166 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.167 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.168 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.169 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.170 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.171 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.172 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.173 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.174 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.175 104189 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.176 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.177 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.178 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.179 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.180 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.180 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.180 104189 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.180 104189 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.187 104189 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.187 104189 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.188 104189 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.188 104189 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.188 104189 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.198 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 9c2d4250-79a9-4504-9090-d7395fcb2080 (UUID: 9c2d4250-79a9-4504-9090-d7395fcb2080) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.225 104189 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.225 104189 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.225 104189 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.225 104189 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.228 104189 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.233 104189 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.237 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '9c2d4250-79a9-4504-9090-d7395fcb2080'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], external_ids={}, name=9c2d4250-79a9-4504-9090-d7395fcb2080, nb_cfg_timestamp=1768224907664, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.237 104189 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fe37841e0d0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.238 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.238 104189 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.238 104189 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.238 104189 INFO oslo_service.service [-] Starting 1 workers
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.242 104189 DEBUG oslo_service.service [-] Started child 104623 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.244 104189 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpzlrv6hzc/privsep.sock']
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.245 104623 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-166181'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.262 104623 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.263 104623 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.263 104623 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.265 104623 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.269 104623 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.273 104623 INFO eventlet.wsgi.server [-] (104623) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 12 13:35:40 compute-0 sudo[104695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbrrrvluwqtzxtgmhmoakrbqibolkkin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224939.7718768-484-51413843641303/AnsiballZ_copy.py'
Jan 12 13:35:40 compute-0 sudo[104695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:40 compute-0 python3.9[104697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768224939.7718768-484-51413843641303/.source.yaml _original_basename=.32rn_mb1 follow=False checksum=01d491c4b9bc9b8e91611198d683fa52b389a2ea backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:40 compute-0 sudo[104695]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:40 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.762 104189 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.763 104189 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpzlrv6hzc/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.687 104723 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.690 104723 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.692 104723 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.692 104723 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104723
Jan 12 13:35:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:40.765 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[6d417ef4-9a6d-4b51-98bf-4888b24cdee4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:35:40 compute-0 sshd-session[95998]: Connection closed by 192.168.122.30 port 59752
Jan 12 13:35:40 compute-0 sshd-session[95995]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:35:40 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Jan 12 13:35:40 compute-0 systemd[1]: session-21.scope: Consumed 23.901s CPU time.
Jan 12 13:35:40 compute-0 systemd-logind[775]: Session 21 logged out. Waiting for processes to exit.
Jan 12 13:35:40 compute-0 systemd-logind[775]: Removed session 21.
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.161 104723 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.161 104723 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.161 104723 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.588 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[c9eb78fa-46f0-4ca0-8f5f-b7575323a2ca]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.590 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, column=external_ids, values=({'neutron:ovn-metadata-id': '7fdab474-308a-5f83-a04b-59083921915c'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.597 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.601 104189 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.601 104189 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.602 104189 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.602 104189 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.602 104189 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.602 104189 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.602 104189 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.602 104189 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.602 104189 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.603 104189 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.603 104189 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.603 104189 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.603 104189 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.603 104189 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.603 104189 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.604 104189 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.605 104189 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.605 104189 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.605 104189 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.605 104189 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.605 104189 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.605 104189 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.606 104189 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.606 104189 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.606 104189 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.606 104189 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.606 104189 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.606 104189 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.607 104189 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.607 104189 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.607 104189 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.607 104189 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.607 104189 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.607 104189 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.607 104189 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.608 104189 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.608 104189 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.608 104189 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.608 104189 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.608 104189 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.608 104189 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.608 104189 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.609 104189 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.609 104189 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.609 104189 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.609 104189 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.609 104189 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.609 104189 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.609 104189 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.610 104189 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.610 104189 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.610 104189 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.610 104189 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.610 104189 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.610 104189 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.610 104189 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.611 104189 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.611 104189 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.611 104189 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.611 104189 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.611 104189 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.611 104189 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.611 104189 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.612 104189 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.612 104189 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.612 104189 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.612 104189 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.612 104189 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.612 104189 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.612 104189 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.613 104189 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.613 104189 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.613 104189 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.613 104189 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.613 104189 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.613 104189 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.613 104189 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.614 104189 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.614 104189 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.614 104189 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.614 104189 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.614 104189 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.614 104189 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.614 104189 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.615 104189 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.616 104189 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.616 104189 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.616 104189 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.616 104189 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.616 104189 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.616 104189 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.616 104189 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.617 104189 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.617 104189 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.617 104189 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.617 104189 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.617 104189 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.618 104189 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.618 104189 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.618 104189 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.618 104189 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.618 104189 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.618 104189 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.618 104189 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.619 104189 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.619 104189 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.619 104189 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.619 104189 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.619 104189 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.619 104189 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.619 104189 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.620 104189 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.620 104189 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.620 104189 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.620 104189 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.620 104189 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.620 104189 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.620 104189 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.621 104189 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.621 104189 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.621 104189 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.621 104189 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.621 104189 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.621 104189 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.622 104189 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.622 104189 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.622 104189 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.622 104189 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.622 104189 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.622 104189 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.622 104189 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.623 104189 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.623 104189 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.623 104189 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.623 104189 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.623 104189 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.623 104189 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.623 104189 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.624 104189 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.625 104189 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.625 104189 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.625 104189 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.625 104189 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.625 104189 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.625 104189 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.625 104189 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.626 104189 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.626 104189 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.626 104189 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.626 104189 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.626 104189 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.626 104189 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.626 104189 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.627 104189 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.628 104189 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.628 104189 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.628 104189 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.628 104189 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.628 104189 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.628 104189 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.628 104189 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.629 104189 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.629 104189 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.629 104189 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.629 104189 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.629 104189 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.629 104189 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.629 104189 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.630 104189 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.630 104189 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.630 104189 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.630 104189 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.630 104189 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.630 104189 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.630 104189 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.631 104189 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.631 104189 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.631 104189 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.631 104189 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.631 104189 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.631 104189 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.631 104189 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.632 104189 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.633 104189 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.633 104189 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.633 104189 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.633 104189 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.633 104189 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.633 104189 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.633 104189 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.634 104189 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.635 104189 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.635 104189 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.635 104189 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.635 104189 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.635 104189 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.635 104189 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.635 104189 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.636 104189 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.636 104189 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.636 104189 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.636 104189 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.636 104189 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.636 104189 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.636 104189 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.637 104189 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.637 104189 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.637 104189 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.637 104189 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.637 104189 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.637 104189 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.637 104189 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.638 104189 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.639 104189 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.639 104189 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.639 104189 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.639 104189 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.639 104189 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.639 104189 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.639 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.640 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.640 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.640 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.640 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.640 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.640 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.641 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.641 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.641 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.641 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.641 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.641 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.641 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.642 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.643 104189 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.644 104189 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.644 104189 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.644 104189 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:35:41 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:35:41.644 104189 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 12 13:35:45 compute-0 sshd-session[104728]: Accepted publickey for zuul from 192.168.122.30 port 58208 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:35:45 compute-0 systemd-logind[775]: New session 22 of user zuul.
Jan 12 13:35:45 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 12 13:35:45 compute-0 sshd-session[104728]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:35:46 compute-0 python3.9[104881]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:35:47 compute-0 sudo[105035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcbnmhpayxzwkhyemlrkprfmjfjxtbvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224946.791897-29-253878086984644/AnsiballZ_command.py'
Jan 12 13:35:47 compute-0 sudo[105035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:47 compute-0 python3.9[105037]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:35:47 compute-0 sudo[105035]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:47 compute-0 sudo[105196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeozqsvfmlpsmdsfrdqfscvmylppoqni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224947.485445-40-232569438767674/AnsiballZ_systemd_service.py'
Jan 12 13:35:47 compute-0 sudo[105196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:48 compute-0 python3.9[105198]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:35:48 compute-0 systemd[1]: Reloading.
Jan 12 13:35:48 compute-0 systemd-rc-local-generator[105219]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:35:48 compute-0 systemd-sysv-generator[105222]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:35:48 compute-0 sudo[105196]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:48 compute-0 python3.9[105383]: ansible-ansible.builtin.service_facts Invoked
Jan 12 13:35:48 compute-0 network[105400]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 12 13:35:48 compute-0 network[105401]: 'network-scripts' will be removed from distribution in near future.
Jan 12 13:35:48 compute-0 network[105402]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 12 13:35:51 compute-0 sudo[105661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxusfeuqvudkygnvvbzeapcywgncqqei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224950.8248932-59-222018112853269/AnsiballZ_systemd_service.py'
Jan 12 13:35:51 compute-0 sudo[105661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:51 compute-0 python3.9[105663]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:51 compute-0 sudo[105661]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:51 compute-0 sudo[105814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sguetkqgrvzjelepfiscataqdckhjgei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224951.3545647-59-192539545228996/AnsiballZ_systemd_service.py'
Jan 12 13:35:51 compute-0 sudo[105814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:51 compute-0 python3.9[105816]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:51 compute-0 sudo[105814]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:52 compute-0 sudo[105967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwgmloeogkbsjryfizycekarvljhkkaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224951.8727663-59-48640068537204/AnsiballZ_systemd_service.py'
Jan 12 13:35:52 compute-0 sudo[105967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:52 compute-0 python3.9[105969]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:52 compute-0 sudo[105967]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:52 compute-0 sudo[106120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkfxmtgqqxnwbospwldosvkxzqrpihfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224952.404616-59-16321755507033/AnsiballZ_systemd_service.py'
Jan 12 13:35:52 compute-0 sudo[106120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:52 compute-0 python3.9[106122]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:52 compute-0 sudo[106120]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:53 compute-0 sudo[106273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcpjusaeuceumdboalojdusfqcyldcpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224952.918724-59-120026081060954/AnsiballZ_systemd_service.py'
Jan 12 13:35:53 compute-0 sudo[106273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:53 compute-0 python3.9[106275]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:53 compute-0 sudo[106273]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:53 compute-0 sudo[106426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nykewlfxnxsrxtzctkdvluztptgnpnwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224953.4321594-59-217524351100227/AnsiballZ_systemd_service.py'
Jan 12 13:35:53 compute-0 sudo[106426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:53 compute-0 python3.9[106428]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:53 compute-0 sudo[106426]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:54 compute-0 sudo[106579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imxcvopttqcmcorpytroabxjpcmucyjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224953.9613347-59-189077945053687/AnsiballZ_systemd_service.py'
Jan 12 13:35:54 compute-0 sudo[106579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:54 compute-0 python3.9[106581]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:35:54 compute-0 sudo[106579]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:55 compute-0 sudo[106732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzchvjzvitfeozbjkboglmnrttojkmaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224954.7142103-111-60166099155567/AnsiballZ_file.py'
Jan 12 13:35:55 compute-0 sudo[106732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:55 compute-0 python3.9[106734]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:55 compute-0 sudo[106732]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:55 compute-0 sudo[106884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jllxkuiwykkmqrwkxkfoafzrswkewckz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224955.2529016-111-123835466120052/AnsiballZ_file.py'
Jan 12 13:35:55 compute-0 sudo[106884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:55 compute-0 python3.9[106886]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:55 compute-0 sudo[106884]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:55 compute-0 sudo[107036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzzngoeschdhxywclwuhqcpsragnflpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224955.6789713-111-31103635965533/AnsiballZ_file.py'
Jan 12 13:35:55 compute-0 sudo[107036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:55 compute-0 python3.9[107038]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:56 compute-0 sudo[107036]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:56 compute-0 sudo[107188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqhmeimdbgdcironslczwzzduipdlcmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224956.0921311-111-131946708511401/AnsiballZ_file.py'
Jan 12 13:35:56 compute-0 sudo[107188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:56 compute-0 python3.9[107190]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:56 compute-0 sudo[107188]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:56 compute-0 sudo[107340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozijmjwbfzgwtrhjrthrmubwzvplvokf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224956.5104-111-35733120412657/AnsiballZ_file.py'
Jan 12 13:35:56 compute-0 sudo[107340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:56 compute-0 python3.9[107342]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:56 compute-0 sudo[107340]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:57 compute-0 sudo[107492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hixopcolckbbwqhfpvsalhczkouyywpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224956.932918-111-20625417833255/AnsiballZ_file.py'
Jan 12 13:35:57 compute-0 sudo[107492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:57 compute-0 python3.9[107494]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:57 compute-0 sudo[107492]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:57 compute-0 sudo[107644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxxzdoqrbflegdfzotqzdwcewapdelmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224957.3362417-111-8500237443255/AnsiballZ_file.py'
Jan 12 13:35:57 compute-0 sudo[107644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:57 compute-0 python3.9[107646]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:57 compute-0 sudo[107644]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:57 compute-0 sudo[107796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foccyzucpfxaqwkvxbpcbpqxfekhwftf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224957.7858827-161-259135622086261/AnsiballZ_file.py'
Jan 12 13:35:57 compute-0 sudo[107796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:58 compute-0 python3.9[107798]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:58 compute-0 sudo[107796]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:58 compute-0 sudo[107948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpxzmfnyjxzwhorqebdvrerlgmhloive ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224958.2065017-161-260770077857367/AnsiballZ_file.py'
Jan 12 13:35:58 compute-0 sudo[107948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:58 compute-0 python3.9[107950]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:58 compute-0 sudo[107948]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:58 compute-0 sudo[108100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auvbttwxkeivcnzjmikryudtcosviksm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224958.6269877-161-233358293445987/AnsiballZ_file.py'
Jan 12 13:35:58 compute-0 sudo[108100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:59 compute-0 python3.9[108102]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:59 compute-0 sudo[108100]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:59 compute-0 sudo[108252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acixbnlidnifeolurkemkwcnugrerkoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224959.164873-161-168716871266514/AnsiballZ_file.py'
Jan 12 13:35:59 compute-0 sudo[108252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:59 compute-0 python3.9[108254]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:59 compute-0 sudo[108252]: pam_unix(sudo:session): session closed for user root
Jan 12 13:35:59 compute-0 sudo[108404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvlzgfrpijoneekrdcyfhbhmbiglpwzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224959.5911937-161-71950368663170/AnsiballZ_file.py'
Jan 12 13:35:59 compute-0 sudo[108404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:35:59 compute-0 python3.9[108406]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:35:59 compute-0 sudo[108404]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:00 compute-0 sudo[108566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjxmdgckaooacaitalsouflymvffwvqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224960.0129247-161-84476761294282/AnsiballZ_file.py'
Jan 12 13:36:00 compute-0 sudo[108566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:00 compute-0 podman[108530]: 2026-01-12 13:36:00.233617391 +0000 UTC m=+0.063188670 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 12 13:36:00 compute-0 python3.9[108574]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:36:00 compute-0 sudo[108566]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:00 compute-0 sudo[108731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnagsrnebrfcvnqohmmyjyojvnlneirf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224960.457689-161-145659327778705/AnsiballZ_file.py'
Jan 12 13:36:00 compute-0 sudo[108731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:00 compute-0 python3.9[108733]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:36:00 compute-0 sudo[108731]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:01 compute-0 sudo[108883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqwxfazyswlalzptbvcfqbqeavojxenu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224960.975959-212-68465908997416/AnsiballZ_command.py'
Jan 12 13:36:01 compute-0 sudo[108883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:01 compute-0 python3.9[108885]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:01 compute-0 sudo[108883]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:01 compute-0 python3.9[109037]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 12 13:36:02 compute-0 sudo[109187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zezwykbqfoshodxawzjwfwpcuerxnvfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224962.1155455-230-24274443557464/AnsiballZ_systemd_service.py'
Jan 12 13:36:02 compute-0 sudo[109187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:02 compute-0 python3.9[109189]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:36:02 compute-0 systemd[1]: Reloading.
Jan 12 13:36:02 compute-0 systemd-rc-local-generator[109211]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:36:02 compute-0 systemd-sysv-generator[109214]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:36:02 compute-0 sudo[109187]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:03 compute-0 sudo[109376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qowfwvtelahbdktrmqpxxclwtwzgukye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224962.884575-238-128910189850167/AnsiballZ_command.py'
Jan 12 13:36:03 compute-0 sudo[109376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:03 compute-0 python3.9[109378]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:03 compute-0 sudo[109376]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:03 compute-0 sudo[109529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cengssibvbaeztskukbnudrakiamlgmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224963.4348972-238-217577294334693/AnsiballZ_command.py'
Jan 12 13:36:03 compute-0 sudo[109529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:03 compute-0 python3.9[109531]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:03 compute-0 sudo[109529]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:04 compute-0 sudo[109682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sujottxvngivfsxyvevkttozzghzegdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224963.8575754-238-38419219421093/AnsiballZ_command.py'
Jan 12 13:36:04 compute-0 sudo[109682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:04 compute-0 python3.9[109684]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:04 compute-0 sudo[109682]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:04 compute-0 sudo[109835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anjyvapydrjpjxkcleunofrhsgouwtmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224964.3035336-238-114769921464106/AnsiballZ_command.py'
Jan 12 13:36:04 compute-0 sudo[109835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:04 compute-0 python3.9[109837]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:04 compute-0 sudo[109835]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:04 compute-0 sudo[109988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuwdybxoagafdlxjsqbfxqvyinfhsugh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224964.7371428-238-217201235771457/AnsiballZ_command.py'
Jan 12 13:36:04 compute-0 sudo[109988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:05 compute-0 python3.9[109990]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:05 compute-0 sudo[109988]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:05 compute-0 sudo[110141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttntgcpbesfknsatvnswvqkzndnqdvmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224965.1553195-238-179085100550555/AnsiballZ_command.py'
Jan 12 13:36:05 compute-0 sudo[110141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:05 compute-0 python3.9[110143]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:05 compute-0 sudo[110141]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:05 compute-0 sudo[110294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iksisohkreyayspfkinykxhgkxprsdgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224965.592925-238-173225710480471/AnsiballZ_command.py'
Jan 12 13:36:05 compute-0 sudo[110294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:05 compute-0 python3.9[110296]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:36:05 compute-0 sudo[110294]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:06 compute-0 sudo[110447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prepqodyjsykyhknbdptrgqokfbjhsgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224966.2007356-292-178204947248418/AnsiballZ_getent.py'
Jan 12 13:36:06 compute-0 sudo[110447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:06 compute-0 python3.9[110449]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 12 13:36:06 compute-0 sudo[110447]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:07 compute-0 sudo[110600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrkqfojkahqjbhkfjejxfhbigfinspjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224966.8373055-300-276364584662914/AnsiballZ_group.py'
Jan 12 13:36:07 compute-0 sudo[110600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:07 compute-0 python3.9[110602]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 12 13:36:07 compute-0 groupadd[110603]: group added to /etc/group: name=libvirt, GID=42473
Jan 12 13:36:07 compute-0 groupadd[110603]: group added to /etc/gshadow: name=libvirt
Jan 12 13:36:07 compute-0 groupadd[110603]: new group: name=libvirt, GID=42473
Jan 12 13:36:07 compute-0 sudo[110600]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:07 compute-0 sudo[110758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uumywadqwjufcgbzvjrpfymjzdcikhsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224967.5526283-308-209844141267731/AnsiballZ_user.py'
Jan 12 13:36:07 compute-0 sudo[110758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:08 compute-0 python3.9[110760]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 12 13:36:08 compute-0 useradd[110762]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 12 13:36:08 compute-0 sudo[110758]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:08 compute-0 sudo[110918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnkickxcppixzchervlnluyjzhmbagzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224968.3866382-319-213655205682564/AnsiballZ_setup.py'
Jan 12 13:36:08 compute-0 sudo[110918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:08 compute-0 python3.9[110920]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:36:09 compute-0 sudo[110918]: pam_unix(sudo:session): session closed for user root
Jan 12 13:36:09 compute-0 sudo[111009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvjenuuhzwxlodxygznwwmicscflbmjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768224968.3866382-319-213655205682564/AnsiballZ_dnf.py'
Jan 12 13:36:09 compute-0 sudo[111009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:36:09 compute-0 podman[110976]: 2026-01-12 13:36:09.474750182 +0000 UTC m=+0.055524043 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 12 13:36:09 compute-0 python3.9[111011]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:36:30 compute-0 podman[111214]: 2026-01-12 13:36:30.561980361 +0000 UTC m=+0.056089556 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 12 13:36:32 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Jan 12 13:36:32 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:36:32 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:36:32 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:36:32 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:36:32 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:36:32 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:36:32 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:36:38 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Jan 12 13:36:38 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:36:38 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:36:38 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:36:38 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:36:38 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:36:38 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:36:38 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:36:39 compute-0 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 12 13:36:39 compute-0 podman[111253]: 2026-01-12 13:36:39.552479483 +0000 UTC m=+0.040796402 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:36:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:36:40.189 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:36:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:36:40.190 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:36:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:36:40.190 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:37:01 compute-0 podman[125714]: 2026-01-12 13:37:01.55527787 +0000 UTC m=+0.054139267 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:37:10 compute-0 podman[128144]: 2026-01-12 13:37:10.541758257 +0000 UTC m=+0.036757354 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 12 13:37:12 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Jan 12 13:37:12 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 12 13:37:12 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 12 13:37:12 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 12 13:37:12 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 12 13:37:12 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 12 13:37:12 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 12 13:37:12 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 12 13:37:13 compute-0 groupadd[128172]: group added to /etc/group: name=dnsmasq, GID=993
Jan 12 13:37:13 compute-0 groupadd[128172]: group added to /etc/gshadow: name=dnsmasq
Jan 12 13:37:13 compute-0 groupadd[128172]: new group: name=dnsmasq, GID=993
Jan 12 13:37:13 compute-0 useradd[128179]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 12 13:37:13 compute-0 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Jan 12 13:37:13 compute-0 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 12 13:37:13 compute-0 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Jan 12 13:37:14 compute-0 groupadd[128192]: group added to /etc/group: name=clevis, GID=992
Jan 12 13:37:14 compute-0 groupadd[128192]: group added to /etc/gshadow: name=clevis
Jan 12 13:37:14 compute-0 groupadd[128192]: new group: name=clevis, GID=992
Jan 12 13:37:14 compute-0 useradd[128199]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 12 13:37:14 compute-0 usermod[128209]: add 'clevis' to group 'tss'
Jan 12 13:37:14 compute-0 usermod[128209]: add 'clevis' to shadow group 'tss'
Jan 12 13:37:15 compute-0 polkitd[43250]: Reloading rules
Jan 12 13:37:15 compute-0 polkitd[43250]: Collecting garbage unconditionally...
Jan 12 13:37:15 compute-0 polkitd[43250]: Loading rules from directory /etc/polkit-1/rules.d
Jan 12 13:37:15 compute-0 polkitd[43250]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 12 13:37:15 compute-0 polkitd[43250]: Finished loading, compiling and executing 3 rules
Jan 12 13:37:15 compute-0 polkitd[43250]: Reloading rules
Jan 12 13:37:15 compute-0 polkitd[43250]: Collecting garbage unconditionally...
Jan 12 13:37:15 compute-0 polkitd[43250]: Loading rules from directory /etc/polkit-1/rules.d
Jan 12 13:37:15 compute-0 polkitd[43250]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 12 13:37:15 compute-0 polkitd[43250]: Finished loading, compiling and executing 3 rules
Jan 12 13:37:16 compute-0 groupadd[128396]: group added to /etc/group: name=ceph, GID=167
Jan 12 13:37:16 compute-0 groupadd[128396]: group added to /etc/gshadow: name=ceph
Jan 12 13:37:16 compute-0 groupadd[128396]: new group: name=ceph, GID=167
Jan 12 13:37:16 compute-0 useradd[128402]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 12 13:37:18 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 12 13:37:18 compute-0 sshd[963]: Received signal 15; terminating.
Jan 12 13:37:18 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 12 13:37:18 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 12 13:37:18 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 12 13:37:18 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 12 13:37:18 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 12 13:37:18 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 12 13:37:18 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 12 13:37:18 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 12 13:37:18 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 12 13:37:18 compute-0 sshd[128921]: Server listening on 0.0.0.0 port 22.
Jan 12 13:37:18 compute-0 sshd[128921]: Server listening on :: port 22.
Jan 12 13:37:18 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 12 13:37:19 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:37:19 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:37:19 compute-0 systemd[1]: Reloading.
Jan 12 13:37:19 compute-0 systemd-sysv-generator[129176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:19 compute-0 systemd-rc-local-generator[129172]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:19 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:37:21 compute-0 sudo[111009]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:22 compute-0 sudo[133546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcxkceeevlcjfqphkzxcldvwcodmyqms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225041.5728273-331-198613969188930/AnsiballZ_systemd.py'
Jan 12 13:37:22 compute-0 sudo[133546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:22 compute-0 python3.9[133568]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:37:22 compute-0 systemd[1]: Reloading.
Jan 12 13:37:22 compute-0 systemd-rc-local-generator[134076]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:22 compute-0 systemd-sysv-generator[134080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:22 compute-0 sudo[133546]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:22 compute-0 sudo[134928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myfqecatztpxupquaeyltpkzkrvisruy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225042.6573348-331-141780053450803/AnsiballZ_systemd.py'
Jan 12 13:37:22 compute-0 sudo[134928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:23 compute-0 python3.9[134956]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:37:23 compute-0 systemd[1]: Reloading.
Jan 12 13:37:23 compute-0 systemd-rc-local-generator[135449]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:23 compute-0 systemd-sysv-generator[135452]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:23 compute-0 sudo[134928]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:23 compute-0 sudo[136135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rceirvvfckzroctiamtkukfvevvyznxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225043.4318-331-84296578342668/AnsiballZ_systemd.py'
Jan 12 13:37:23 compute-0 sudo[136135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:23 compute-0 python3.9[136161]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:37:23 compute-0 systemd[1]: Reloading.
Jan 12 13:37:23 compute-0 systemd-rc-local-generator[136673]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:23 compute-0 systemd-sysv-generator[136680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:24 compute-0 sudo[136135]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:24 compute-0 sudo[137444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfrkewvtljsqxkqalkzjiyoawtrrxhzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225044.2006478-331-72991817141858/AnsiballZ_systemd.py'
Jan 12 13:37:24 compute-0 sudo[137444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:24 compute-0 python3.9[137459]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:37:24 compute-0 systemd[1]: Reloading.
Jan 12 13:37:24 compute-0 systemd-rc-local-generator[137921]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:24 compute-0 systemd-sysv-generator[137924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:24 compute-0 sudo[137444]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:25 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:37:25 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:37:25 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.914s CPU time.
Jan 12 13:37:25 compute-0 systemd[1]: run-r3ef5f06080954253a1bcdf8955ff7b6d.service: Deactivated successfully.
Jan 12 13:37:25 compute-0 sudo[138468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyeuokqhwcuebravpkkwuyrjmzpsftdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225045.027023-360-89383769509453/AnsiballZ_systemd.py'
Jan 12 13:37:25 compute-0 sudo[138468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:25 compute-0 python3.9[138470]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:25 compute-0 systemd[1]: Reloading.
Jan 12 13:37:25 compute-0 systemd-rc-local-generator[138494]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:25 compute-0 systemd-sysv-generator[138497]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:25 compute-0 sudo[138468]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:25 compute-0 sudo[138658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrjnfipcuprmhazzsjxopuyhvqkursny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225045.7597463-360-271982138930435/AnsiballZ_systemd.py'
Jan 12 13:37:25 compute-0 sudo[138658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:26 compute-0 python3.9[138660]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:26 compute-0 systemd[1]: Reloading.
Jan 12 13:37:26 compute-0 systemd-sysv-generator[138686]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:26 compute-0 systemd-rc-local-generator[138683]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:26 compute-0 sudo[138658]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:26 compute-0 sudo[138848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeysuyjfwajejrwpdjqtcqzehoppqjly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225046.5227258-360-2351070673878/AnsiballZ_systemd.py'
Jan 12 13:37:26 compute-0 sudo[138848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:26 compute-0 python3.9[138850]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:26 compute-0 systemd[1]: Reloading.
Jan 12 13:37:27 compute-0 systemd-rc-local-generator[138874]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:27 compute-0 systemd-sysv-generator[138877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:27 compute-0 sudo[138848]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:27 compute-0 sudo[139037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zatxudfeqtqhhmvzejzptuddmvqnrwcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225047.2530286-360-232736969132664/AnsiballZ_systemd.py'
Jan 12 13:37:27 compute-0 sudo[139037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:27 compute-0 python3.9[139039]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:27 compute-0 sudo[139037]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:27 compute-0 sudo[139192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykkskbmlpdlozytucylflficccjceomi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225047.788812-360-173586115225511/AnsiballZ_systemd.py'
Jan 12 13:37:27 compute-0 sudo[139192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:28 compute-0 python3.9[139194]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:28 compute-0 systemd[1]: Reloading.
Jan 12 13:37:28 compute-0 systemd-sysv-generator[139224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:28 compute-0 systemd-rc-local-generator[139221]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:28 compute-0 sudo[139192]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:28 compute-0 sudo[139382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjrwwumyhmnghuglmixlckjordybpjei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225048.5373127-396-139693201520662/AnsiballZ_systemd.py'
Jan 12 13:37:28 compute-0 sudo[139382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:28 compute-0 python3.9[139384]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 12 13:37:28 compute-0 systemd[1]: Reloading.
Jan 12 13:37:29 compute-0 systemd-rc-local-generator[139408]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:37:29 compute-0 systemd-sysv-generator[139411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:37:29 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 12 13:37:29 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 12 13:37:29 compute-0 sudo[139382]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:29 compute-0 sudo[139575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asrdqdvjqojkubqfpvqjvkekyljtyiis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225049.329186-404-2970190759473/AnsiballZ_systemd.py'
Jan 12 13:37:29 compute-0 sudo[139575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:29 compute-0 python3.9[139577]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:29 compute-0 sudo[139575]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:30 compute-0 sudo[139730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgzygttnekputdmxmzxnyfzouukmdtwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225049.861786-404-201479244659598/AnsiballZ_systemd.py'
Jan 12 13:37:30 compute-0 sudo[139730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:30 compute-0 python3.9[139732]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:30 compute-0 sudo[139730]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:30 compute-0 sudo[139885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpkqdyklqobzvtpjkrrohirqmsfikemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225050.407976-404-263488140975652/AnsiballZ_systemd.py'
Jan 12 13:37:30 compute-0 sudo[139885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:30 compute-0 python3.9[139887]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:30 compute-0 sudo[139885]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:31 compute-0 sudo[140040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xereggquimlbbrptcyxazdguzbjeuqfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225050.9631894-404-14832330641062/AnsiballZ_systemd.py'
Jan 12 13:37:31 compute-0 sudo[140040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:31 compute-0 python3.9[140042]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:31 compute-0 sudo[140040]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:31 compute-0 sudo[140203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqakvbsvpnknjglytujksiaxblfbedjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225051.4926963-404-76246027082868/AnsiballZ_systemd.py'
Jan 12 13:37:31 compute-0 sudo[140203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:31 compute-0 podman[140169]: 2026-01-12 13:37:31.729536594 +0000 UTC m=+0.061026876 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 12 13:37:31 compute-0 python3.9[140212]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:31 compute-0 sudo[140203]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:32 compute-0 sudo[140373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjohjqbkxgckcebkjfmcnmbqbwkbtkbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225052.0590353-404-239973342757458/AnsiballZ_systemd.py'
Jan 12 13:37:32 compute-0 sudo[140373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:32 compute-0 python3.9[140375]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:32 compute-0 sudo[140373]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:32 compute-0 sudo[140528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aijpqjkdbmgrfhairidczzsadrpwigmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225052.605906-404-239667536678722/AnsiballZ_systemd.py'
Jan 12 13:37:32 compute-0 sudo[140528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:33 compute-0 python3.9[140530]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:33 compute-0 sudo[140528]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:33 compute-0 sudo[140683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-preqyzzkjajonxjhthfjwvefpemwfwas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225053.1655908-404-239909406525946/AnsiballZ_systemd.py'
Jan 12 13:37:33 compute-0 sudo[140683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:33 compute-0 python3.9[140685]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:33 compute-0 sudo[140683]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:33 compute-0 sudo[140838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwhajucvvlimfikcqvjrxwlrcprmhkru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225053.7046316-404-102806205239856/AnsiballZ_systemd.py'
Jan 12 13:37:33 compute-0 sudo[140838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:34 compute-0 python3.9[140840]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:34 compute-0 sudo[140838]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:34 compute-0 sudo[140993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abjuolglsmteklzosfyxdpbcjiczktrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225054.2405303-404-117920674764726/AnsiballZ_systemd.py'
Jan 12 13:37:34 compute-0 sudo[140993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:34 compute-0 python3.9[140995]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:34 compute-0 sudo[140993]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:34 compute-0 sudo[141148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iumvvwrmpsiymufadlgfqnjzxgdwepwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225054.7791774-404-226438475466013/AnsiballZ_systemd.py'
Jan 12 13:37:34 compute-0 sudo[141148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:35 compute-0 python3.9[141150]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:35 compute-0 sudo[141148]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:35 compute-0 sudo[141303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvaoxlekkoqpbegcfcveguompcaesrfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225055.3254771-404-211069120875203/AnsiballZ_systemd.py'
Jan 12 13:37:35 compute-0 sudo[141303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:35 compute-0 python3.9[141305]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:35 compute-0 sudo[141303]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:36 compute-0 sudo[141458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzmdmojxmklsnjamiushnhhgfnacupvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225055.8652587-404-267847564814308/AnsiballZ_systemd.py'
Jan 12 13:37:36 compute-0 sudo[141458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:36 compute-0 python3.9[141460]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:36 compute-0 sudo[141458]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:36 compute-0 sudo[141613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvjivtomsgwclgtudguylixxqjztaoak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225056.4176576-404-188598442065093/AnsiballZ_systemd.py'
Jan 12 13:37:36 compute-0 sudo[141613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:36 compute-0 python3.9[141615]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 12 13:37:36 compute-0 sudo[141613]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:37 compute-0 sudo[141768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwlkfyzivnxfnjudpwikqjfgkqubvcqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225057.1033726-506-43693968606574/AnsiballZ_file.py'
Jan 12 13:37:37 compute-0 sudo[141768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:37 compute-0 python3.9[141770]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:37:37 compute-0 sudo[141768]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:37 compute-0 sudo[141920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbbmhrnyycttjraxudfhgmklnpljqxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225057.502368-506-105552637436697/AnsiballZ_file.py'
Jan 12 13:37:37 compute-0 sudo[141920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:37 compute-0 python3.9[141922]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:37:37 compute-0 sudo[141920]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:38 compute-0 sudo[142072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlidyizzyobatmhcmyktsyupbpjsraza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225057.924754-506-219231409192/AnsiballZ_file.py'
Jan 12 13:37:38 compute-0 sudo[142072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:38 compute-0 python3.9[142074]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:37:38 compute-0 sudo[142072]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:38 compute-0 sudo[142224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzzbdyfueicewbwnjdtxqwwicalulmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225058.3293364-506-221438429403920/AnsiballZ_file.py'
Jan 12 13:37:38 compute-0 sudo[142224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:38 compute-0 python3.9[142226]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:37:38 compute-0 sudo[142224]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:38 compute-0 sudo[142376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcqujcysicrotgcvabdiwdrjvuwdddjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225058.7338367-506-135124084743859/AnsiballZ_file.py'
Jan 12 13:37:38 compute-0 sudo[142376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:39 compute-0 python3.9[142378]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:37:39 compute-0 sudo[142376]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:39 compute-0 sudo[142528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apzwvsobewmpuizpsorcsitguhdqmqaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225059.1579573-506-162425133787913/AnsiballZ_file.py'
Jan 12 13:37:39 compute-0 sudo[142528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:39 compute-0 python3.9[142530]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:37:39 compute-0 sudo[142528]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:39 compute-0 sudo[142680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwidjuhtkuhknfnnppwgghvciiilqgku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225059.5878685-549-19147626488649/AnsiballZ_stat.py'
Jan 12 13:37:39 compute-0 sudo[142680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:40 compute-0 python3.9[142682]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:40 compute-0 sudo[142680]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:37:40.190 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:37:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:37:40.191 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:37:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:37:40.191 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:37:40 compute-0 sudo[142805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgjvhkvkptfvykyqalembrcgmunypvzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225059.5878685-549-19147626488649/AnsiballZ_copy.py'
Jan 12 13:37:40 compute-0 sudo[142805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:40 compute-0 python3.9[142807]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225059.5878685-549-19147626488649/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:40 compute-0 sudo[142805]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:40 compute-0 sudo[142965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwwysucvaxuxlrgtgetpokopctegbwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225060.6333833-549-154218480193921/AnsiballZ_stat.py'
Jan 12 13:37:40 compute-0 sudo[142965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:40 compute-0 podman[142931]: 2026-01-12 13:37:40.871321287 +0000 UTC m=+0.035533571 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 12 13:37:41 compute-0 python3.9[142974]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:41 compute-0 sudo[142965]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:41 compute-0 sudo[143098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avmrbskwrxgfyivhjmqebyxqiodguiwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225060.6333833-549-154218480193921/AnsiballZ_copy.py'
Jan 12 13:37:41 compute-0 sudo[143098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:41 compute-0 python3.9[143100]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225060.6333833-549-154218480193921/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:41 compute-0 sudo[143098]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:41 compute-0 sudo[143250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxydkgslfbopavdwmexhussvhmpbtvjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225061.595173-549-100337112979519/AnsiballZ_stat.py'
Jan 12 13:37:41 compute-0 sudo[143250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:41 compute-0 python3.9[143252]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:41 compute-0 sudo[143250]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:42 compute-0 sudo[143375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtyjtnzanghwzcahpegjfawctyzhzdww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225061.595173-549-100337112979519/AnsiballZ_copy.py'
Jan 12 13:37:42 compute-0 sudo[143375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:42 compute-0 python3.9[143377]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225061.595173-549-100337112979519/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:42 compute-0 sudo[143375]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:42 compute-0 sudo[143527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnfociimhmqwnldxrnyhqpzvhrqhmpfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225062.413655-549-118270940485941/AnsiballZ_stat.py'
Jan 12 13:37:42 compute-0 sudo[143527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:42 compute-0 python3.9[143529]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:42 compute-0 sudo[143527]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:42 compute-0 sudo[143652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbwwwiwpysdfldchgjloozsdkzkjntne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225062.413655-549-118270940485941/AnsiballZ_copy.py'
Jan 12 13:37:42 compute-0 sudo[143652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:43 compute-0 python3.9[143654]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225062.413655-549-118270940485941/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:43 compute-0 sudo[143652]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:43 compute-0 sudo[143804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzccgkcpmvhzvzmcqhboftiaffaktmhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225063.204089-549-45432803455809/AnsiballZ_stat.py'
Jan 12 13:37:43 compute-0 sudo[143804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:43 compute-0 python3.9[143806]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:43 compute-0 sudo[143804]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:43 compute-0 sudo[143929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guvcctrpophfvvksdgahydguanrdjpxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225063.204089-549-45432803455809/AnsiballZ_copy.py'
Jan 12 13:37:43 compute-0 sudo[143929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:43 compute-0 python3.9[143931]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225063.204089-549-45432803455809/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:43 compute-0 sudo[143929]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:44 compute-0 sudo[144081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftxhthlahbibukxoaupsrigllderjwql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225063.9978213-549-241091270842777/AnsiballZ_stat.py'
Jan 12 13:37:44 compute-0 sudo[144081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:44 compute-0 python3.9[144083]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:44 compute-0 sudo[144081]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:44 compute-0 sudo[144206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycflrsxfieyjqpxxgaqfewzqfiahmnqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225063.9978213-549-241091270842777/AnsiballZ_copy.py'
Jan 12 13:37:44 compute-0 sudo[144206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:44 compute-0 python3.9[144208]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225063.9978213-549-241091270842777/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:44 compute-0 sudo[144206]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:44 compute-0 sudo[144358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szkmoohiorvmrmepaasafwtppzlntfke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225064.7740705-549-281394926232445/AnsiballZ_stat.py'
Jan 12 13:37:44 compute-0 sudo[144358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:45 compute-0 python3.9[144360]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:45 compute-0 sudo[144358]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:45 compute-0 sudo[144481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpbtzltsnajjmgkbvlnxgukfvfxefrtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225064.7740705-549-281394926232445/AnsiballZ_copy.py'
Jan 12 13:37:45 compute-0 sudo[144481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:45 compute-0 python3.9[144483]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225064.7740705-549-281394926232445/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:45 compute-0 sudo[144481]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:45 compute-0 sudo[144633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpwwwlqwyxdoibvavfbhhyvvquhsjpay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225065.5569003-549-266056266921630/AnsiballZ_stat.py'
Jan 12 13:37:45 compute-0 sudo[144633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:45 compute-0 python3.9[144635]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:45 compute-0 sudo[144633]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:46 compute-0 sudo[144758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xraprwegiahpsvxqyjkxiwjrbyrjwrfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225065.5569003-549-266056266921630/AnsiballZ_copy.py'
Jan 12 13:37:46 compute-0 sudo[144758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:46 compute-0 python3.9[144760]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1768225065.5569003-549-266056266921630/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:46 compute-0 sudo[144758]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:46 compute-0 sudo[144910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nehfcpdewmjvppfeezyzgbfghvqxllps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225066.348356-662-12319137806661/AnsiballZ_command.py'
Jan 12 13:37:46 compute-0 sudo[144910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:46 compute-0 python3.9[144912]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 12 13:37:46 compute-0 sudo[144910]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:46 compute-0 sudo[145063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zluiesgvhxauqdflecdlqtigxrwtxloh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225066.811154-671-81804319400076/AnsiballZ_file.py'
Jan 12 13:37:46 compute-0 sudo[145063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:47 compute-0 python3.9[145065]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:47 compute-0 sudo[145063]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:47 compute-0 sudo[145215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjhsclrhctkesmvmtbobtstfdbexgyvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225067.2616408-671-221868625802114/AnsiballZ_file.py'
Jan 12 13:37:47 compute-0 sudo[145215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:47 compute-0 python3.9[145217]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:47 compute-0 sudo[145215]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:47 compute-0 sudo[145367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocbijykjqtbgyyrmxctjgirvteawdiax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225067.6761227-671-262626441825658/AnsiballZ_file.py'
Jan 12 13:37:47 compute-0 sudo[145367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:47 compute-0 python3.9[145369]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:48 compute-0 sudo[145367]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:48 compute-0 sudo[145519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qysxsrgacpwtqiljeddydwxhubtksrmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225068.0860903-671-118899785352968/AnsiballZ_file.py'
Jan 12 13:37:48 compute-0 sudo[145519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:48 compute-0 python3.9[145521]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:48 compute-0 sudo[145519]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:48 compute-0 sudo[145671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldfmoqqcqpmfthaxkguwfjzrcfahagol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225068.5075908-671-175235534038060/AnsiballZ_file.py'
Jan 12 13:37:48 compute-0 sudo[145671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:48 compute-0 python3.9[145673]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:48 compute-0 sudo[145671]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:49 compute-0 sudo[145823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kztxgwgxvsmafgxhkftiwqjpawqqkfky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225068.9231787-671-189830647047757/AnsiballZ_file.py'
Jan 12 13:37:49 compute-0 sudo[145823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:49 compute-0 python3.9[145825]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:49 compute-0 sudo[145823]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:49 compute-0 sudo[145975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqcbnxqehphlncpxqackcrbrltkcbuqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225069.3276677-671-259074092581137/AnsiballZ_file.py'
Jan 12 13:37:49 compute-0 sudo[145975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:49 compute-0 python3.9[145977]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:49 compute-0 sudo[145975]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:49 compute-0 sudo[146127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdnronmxnasiyxuwbxgkrgaitpqxekwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225069.7351968-671-217889344112644/AnsiballZ_file.py'
Jan 12 13:37:49 compute-0 sudo[146127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:50 compute-0 python3.9[146129]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:50 compute-0 sudo[146127]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:50 compute-0 sudo[146279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbwnrmwseubosopwjrktvbrcbolemjbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225070.1539254-671-269155378074138/AnsiballZ_file.py'
Jan 12 13:37:50 compute-0 sudo[146279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:50 compute-0 python3.9[146281]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:50 compute-0 sudo[146279]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:50 compute-0 sudo[146431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikntcyfqhqsghrxftsvcquibmxkbsnjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225070.5593083-671-103179957722006/AnsiballZ_file.py'
Jan 12 13:37:50 compute-0 sudo[146431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:50 compute-0 python3.9[146433]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:50 compute-0 sudo[146431]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:51 compute-0 sudo[146583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lliqsbwrvrxbkmidpcgupdvazgmodvvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225070.9493697-671-143443752259530/AnsiballZ_file.py'
Jan 12 13:37:51 compute-0 sudo[146583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:51 compute-0 python3.9[146585]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:51 compute-0 sudo[146583]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:51 compute-0 sudo[146735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsfwyidupgbicximwftebuokqfabgwcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225071.365012-671-154607754467020/AnsiballZ_file.py'
Jan 12 13:37:51 compute-0 sudo[146735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:51 compute-0 python3.9[146737]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:51 compute-0 sudo[146735]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:51 compute-0 sudo[146887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fycoemxdxmsrjpkyktznltqeypqpufri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225071.7669036-671-280314364415925/AnsiballZ_file.py'
Jan 12 13:37:51 compute-0 sudo[146887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:52 compute-0 python3.9[146889]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:52 compute-0 sudo[146887]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:52 compute-0 sudo[147039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmtweniuitxznjoounxrpmngnbqeaczr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225072.1646104-671-39447254627179/AnsiballZ_file.py'
Jan 12 13:37:52 compute-0 sudo[147039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:52 compute-0 python3.9[147041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:52 compute-0 sudo[147039]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:52 compute-0 sudo[147191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adsjxhnvcrbmvhwdbfxngltxjcjfizxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225072.6476464-770-104029397869830/AnsiballZ_stat.py'
Jan 12 13:37:52 compute-0 sudo[147191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:52 compute-0 python3.9[147193]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:52 compute-0 sudo[147191]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:53 compute-0 sudo[147314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyqkfhdwnvidvvopymnzfnusvvsvogwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225072.6476464-770-104029397869830/AnsiballZ_copy.py'
Jan 12 13:37:53 compute-0 sudo[147314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:53 compute-0 python3.9[147316]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225072.6476464-770-104029397869830/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:53 compute-0 sudo[147314]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:53 compute-0 sudo[147466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiovvzvoirutrsfndnogairhtauteaun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225073.4872713-770-146021324113452/AnsiballZ_stat.py'
Jan 12 13:37:53 compute-0 sudo[147466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:53 compute-0 python3.9[147468]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:53 compute-0 sudo[147466]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:54 compute-0 sudo[147589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldpazrxmpddccmrggedxazelmbijeijy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225073.4872713-770-146021324113452/AnsiballZ_copy.py'
Jan 12 13:37:54 compute-0 sudo[147589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:54 compute-0 python3.9[147591]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225073.4872713-770-146021324113452/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:54 compute-0 sudo[147589]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:54 compute-0 sudo[147741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etxffxeiiozrzdsrjnmkqqkbgkfiekch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225074.3448675-770-85000202103983/AnsiballZ_stat.py'
Jan 12 13:37:54 compute-0 sudo[147741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:54 compute-0 python3.9[147743]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:54 compute-0 sudo[147741]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:54 compute-0 sudo[147864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvznvvzhnbgieagcunhwgrsrhcmbojvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225074.3448675-770-85000202103983/AnsiballZ_copy.py'
Jan 12 13:37:54 compute-0 sudo[147864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:55 compute-0 python3.9[147866]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225074.3448675-770-85000202103983/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:55 compute-0 sudo[147864]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:55 compute-0 sudo[148016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdfneorpogsaumqjepwmoaxnuigzajju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225075.1253688-770-127321531445219/AnsiballZ_stat.py'
Jan 12 13:37:55 compute-0 sudo[148016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:55 compute-0 python3.9[148018]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:55 compute-0 sudo[148016]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:55 compute-0 sudo[148139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzsaibjfbcarxqdmsyszinwiqjieqrpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225075.1253688-770-127321531445219/AnsiballZ_copy.py'
Jan 12 13:37:55 compute-0 sudo[148139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:55 compute-0 python3.9[148141]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225075.1253688-770-127321531445219/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:55 compute-0 sudo[148139]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:56 compute-0 sudo[148291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilqefobwythlsqsnamynqnuzrzarbzlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225075.9308736-770-139441832975479/AnsiballZ_stat.py'
Jan 12 13:37:56 compute-0 sudo[148291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:56 compute-0 python3.9[148293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:56 compute-0 sudo[148291]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:56 compute-0 sudo[148414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edigatuywmxphjevccadmcdrnwkrbvjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225075.9308736-770-139441832975479/AnsiballZ_copy.py'
Jan 12 13:37:56 compute-0 sudo[148414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:56 compute-0 python3.9[148416]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225075.9308736-770-139441832975479/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:56 compute-0 sudo[148414]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:56 compute-0 sudo[148566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzxhyuwhgmedhbnrdpvlnldaesddqtdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225076.725179-770-53778113127069/AnsiballZ_stat.py'
Jan 12 13:37:56 compute-0 sudo[148566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:57 compute-0 python3.9[148568]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:57 compute-0 sudo[148566]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:57 compute-0 sudo[148689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ercmxrvttxmpvklcczzuwahrpjvajmdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225076.725179-770-53778113127069/AnsiballZ_copy.py'
Jan 12 13:37:57 compute-0 sudo[148689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:57 compute-0 python3.9[148691]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225076.725179-770-53778113127069/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:57 compute-0 sudo[148689]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:57 compute-0 sudo[148841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liluiwyknoodtnzdhjfxfrhmbesxdlql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225077.5075963-770-10373204613374/AnsiballZ_stat.py'
Jan 12 13:37:57 compute-0 sudo[148841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:57 compute-0 python3.9[148843]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:57 compute-0 sudo[148841]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:58 compute-0 sudo[148964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlphzjlqpcsfgntpzzlmtwmdongerxsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225077.5075963-770-10373204613374/AnsiballZ_copy.py'
Jan 12 13:37:58 compute-0 sudo[148964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:58 compute-0 python3.9[148966]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225077.5075963-770-10373204613374/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:58 compute-0 sudo[148964]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:58 compute-0 sudo[149116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwjbsnbblhinutwhpcifmrsgfautoinr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225078.3283834-770-170920529627711/AnsiballZ_stat.py'
Jan 12 13:37:58 compute-0 sudo[149116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:58 compute-0 python3.9[149118]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:58 compute-0 sudo[149116]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:58 compute-0 sudo[149239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raqegddgkceudydbluzxhtjqwainrxva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225078.3283834-770-170920529627711/AnsiballZ_copy.py'
Jan 12 13:37:58 compute-0 sudo[149239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:59 compute-0 python3.9[149241]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225078.3283834-770-170920529627711/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:59 compute-0 sudo[149239]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:59 compute-0 sudo[149391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efpnkbwfxjkoqhxdxjaezrkgifnwmqeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225079.234133-770-229042701051241/AnsiballZ_stat.py'
Jan 12 13:37:59 compute-0 sudo[149391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:59 compute-0 python3.9[149393]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:37:59 compute-0 sudo[149391]: pam_unix(sudo:session): session closed for user root
Jan 12 13:37:59 compute-0 sudo[149514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqcyvjctuvnbetqjkqzixvjglackervt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225079.234133-770-229042701051241/AnsiballZ_copy.py'
Jan 12 13:37:59 compute-0 sudo[149514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:37:59 compute-0 python3.9[149516]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225079.234133-770-229042701051241/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:37:59 compute-0 sudo[149514]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:00 compute-0 sudo[149666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwgmfydnnjvbhlkxjypwfotemctqhgiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225080.0459807-770-43438723338397/AnsiballZ_stat.py'
Jan 12 13:38:00 compute-0 sudo[149666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:00 compute-0 python3.9[149668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:00 compute-0 sudo[149666]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:00 compute-0 sudo[149789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgmxddvubdjfaikjuluvvjlejosklcdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225080.0459807-770-43438723338397/AnsiballZ_copy.py'
Jan 12 13:38:00 compute-0 sudo[149789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:00 compute-0 python3.9[149791]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225080.0459807-770-43438723338397/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:00 compute-0 sudo[149789]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:01 compute-0 sudo[149941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcvwalbvgfufgiddejnjcpgkaruqxkru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225080.9375353-770-216396631264317/AnsiballZ_stat.py'
Jan 12 13:38:01 compute-0 sudo[149941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:01 compute-0 python3.9[149943]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:01 compute-0 sudo[149941]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:01 compute-0 sudo[150064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytctnltpzamuxxmlhvkdjffmebdtrewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225080.9375353-770-216396631264317/AnsiballZ_copy.py'
Jan 12 13:38:01 compute-0 sudo[150064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:01 compute-0 python3.9[150066]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225080.9375353-770-216396631264317/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:01 compute-0 sudo[150064]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:02 compute-0 sudo[150229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zldsvyfbfhnqlirlvwzvcrfbqnhrzhtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225081.8585691-770-40622637957992/AnsiballZ_stat.py'
Jan 12 13:38:02 compute-0 sudo[150229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:02 compute-0 podman[150190]: 2026-01-12 13:38:02.110783029 +0000 UTC m=+0.065326960 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:38:02 compute-0 python3.9[150236]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:02 compute-0 sudo[150229]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:02 compute-0 sudo[150363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhrgzarapplkqwrbhwcckvcqnnpxmwfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225081.8585691-770-40622637957992/AnsiballZ_copy.py'
Jan 12 13:38:02 compute-0 sudo[150363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:02 compute-0 python3.9[150365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225081.8585691-770-40622637957992/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:02 compute-0 sudo[150363]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:03 compute-0 sudo[150515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmvreocsnafvffruoecqndfsktxoxlcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225082.8546004-770-88280096825209/AnsiballZ_stat.py'
Jan 12 13:38:03 compute-0 sudo[150515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:03 compute-0 python3.9[150517]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:03 compute-0 sudo[150515]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:03 compute-0 sudo[150638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdlzmvdntizdgxlgdhtdqomqctfbogbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225082.8546004-770-88280096825209/AnsiballZ_copy.py'
Jan 12 13:38:03 compute-0 sudo[150638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:03 compute-0 python3.9[150640]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225082.8546004-770-88280096825209/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:03 compute-0 sudo[150638]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:03 compute-0 sudo[150790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovpnshunmgwvpbkaoqzvguaxehhcvhbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225083.7153764-770-205399270004991/AnsiballZ_stat.py'
Jan 12 13:38:03 compute-0 sudo[150790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:04 compute-0 python3.9[150792]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:04 compute-0 sudo[150790]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:04 compute-0 sudo[150913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpcrgjxwjlhjeqwfsexppnlbmnnencdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225083.7153764-770-205399270004991/AnsiballZ_copy.py'
Jan 12 13:38:04 compute-0 sudo[150913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:04 compute-0 python3.9[150915]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225083.7153764-770-205399270004991/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:04 compute-0 sudo[150913]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:04 compute-0 python3.9[151065]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
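The raw command above pipes `ls -lRZ /run/libvirt` through `grep -E ':container_\S+_t'` to check whether any libvirt runtime files carry a container SELinux type. A minimal sketch of what that regex matches, using a fabricated `ls -lZ`-style line rather than the real `/run/libvirt` contents:

```shell
# Hypothetical ls -lZ output line; the label field is what the regex inspects.
line='srwxrwxrwx. 1 root root system_u:object_r:container_file_t:s0 0 Jan 12 13:38 virtqemud-sock'

# Matches: ':container_' followed by non-whitespace, ending in '_t'
printf '%s\n' "$line" | grep -E ':container_\S+_t'

# A non-container label such as virt_var_run_t would NOT match:
line2='srwxrwxrwx. 1 root root system_u:object_r:virt_var_run_t:s0 0 Jan 12 13:38 virtqemud-sock'
printf '%s\n' "$line2" | grep -E ':container_\S+_t' || echo 'no container label'
```

Note that `\S` is a GNU grep extension; the deployment relies on it being available on the RHEL host.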
Jan 12 13:38:05 compute-0 sudo[151218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztlihczjtgblogcaletylmxraznopnln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225085.0410419-976-92245495347185/AnsiballZ_seboolean.py'
Jan 12 13:38:05 compute-0 sudo[151218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:05 compute-0 python3.9[151220]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 12 13:38:06 compute-0 sudo[151218]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:06 compute-0 sudo[151374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzqesrurvggzmijglzyehqjgslzgrotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225086.3753834-984-32587296140903/AnsiballZ_copy.py'
Jan 12 13:38:06 compute-0 dbus-broker-launch[766]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 12 13:38:06 compute-0 sudo[151374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:06 compute-0 python3.9[151376]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:06 compute-0 sudo[151374]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:06 compute-0 sudo[151526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stsodbtmjgeyulcfzigcwvlvvbsqofca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225086.819729-984-230570207355815/AnsiballZ_copy.py'
Jan 12 13:38:06 compute-0 sudo[151526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:07 compute-0 python3.9[151528]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:07 compute-0 sudo[151526]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:07 compute-0 sudo[151678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjasvpeopmwximdeuqqjgesvuplkvynm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225087.373051-984-12094265227902/AnsiballZ_copy.py'
Jan 12 13:38:07 compute-0 sudo[151678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:07 compute-0 python3.9[151680]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:07 compute-0 sudo[151678]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:07 compute-0 sudo[151830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwlandmudkmztjeuavboxmiyjbfotamg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225087.7976801-984-231824932736370/AnsiballZ_copy.py'
Jan 12 13:38:07 compute-0 sudo[151830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:08 compute-0 python3.9[151832]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:08 compute-0 sudo[151830]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:08 compute-0 sudo[151982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-silweamlwuoqwgxojgvhzhvwjasvaawg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225088.2314758-984-218331737098527/AnsiballZ_copy.py'
Jan 12 13:38:08 compute-0 sudo[151982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:08 compute-0 python3.9[151984]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:08 compute-0 sudo[151982]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:08 compute-0 sudo[152134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocrrfueenwbzwyyjjftcweewrmvybvep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225088.7040153-1020-225697676523479/AnsiballZ_copy.py'
Jan 12 13:38:08 compute-0 sudo[152134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:09 compute-0 python3.9[152136]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:09 compute-0 sudo[152134]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:09 compute-0 sudo[152286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cejbgrzxwzcqjiwbznqgsmfkpviyraya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225089.1392186-1020-94005969312525/AnsiballZ_copy.py'
Jan 12 13:38:09 compute-0 sudo[152286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:09 compute-0 python3.9[152288]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:09 compute-0 sudo[152286]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:09 compute-0 sudo[152438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvobksziuxqbjdqkseizfohwentbivah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225089.5669236-1020-139536343475921/AnsiballZ_copy.py'
Jan 12 13:38:09 compute-0 sudo[152438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:09 compute-0 python3.9[152440]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:09 compute-0 sudo[152438]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:10 compute-0 sudo[152590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erkniiyvqhkunmmpolxzvprydnpmtgdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225090.007625-1020-16155218082066/AnsiballZ_copy.py'
Jan 12 13:38:10 compute-0 sudo[152590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:10 compute-0 python3.9[152592]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:10 compute-0 sudo[152590]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:10 compute-0 sudo[152742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxjtipmixvvthvnckvcfnexyqksrgsjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225090.44035-1020-210051981617608/AnsiballZ_copy.py'
Jan 12 13:38:10 compute-0 sudo[152742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:10 compute-0 python3.9[152744]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:10 compute-0 sudo[152742]: pam_unix(sudo:session): session closed for user root
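The copy tasks above fan the same `tls.crt`/`tls.key`/`ca.crt` trio out to the libvirt (`/etc/pki/libvirt`, `/etc/pki/CA`) and QEMU (`/etc/pki/qemu`) PKI locations. One way to sanity-check such a deployment is to confirm a cert and key actually belong together by comparing their public keys; this sketch does so against a throwaway self-signed pair rather than the real deployed files:

```shell
# Generate a disposable self-signed cert/key pair (stand-in for tls.crt/tls.key)
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -keyout "$tmp/tls.key" -out "$tmp/tls.crt" \
  -days 1 -nodes -subj "/CN=compute-0" 2>/dev/null

# Extract the public key from each side; matching hashes mean the pair belongs together
h1=$(openssl x509 -in "$tmp/tls.crt" -noout -pubkey | sha256sum)
h2=$(openssl pkey -in "$tmp/tls.key" -pubout | sha256sum)
[ "$h1" = "$h2" ] && echo 'cert/key match'
rm -rf "$tmp"
```

On the real host the same two `openssl` invocations would point at, e.g., `/etc/pki/qemu/server-cert.pem` and `/etc/pki/qemu/server-key.pem`.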
Jan 12 13:38:11 compute-0 sudo[152903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuentmpjdwtcymctufkzyfxfxepworbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225090.9353004-1056-262809271586854/AnsiballZ_systemd.py'
Jan 12 13:38:11 compute-0 sudo[152903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:11 compute-0 podman[152868]: 2026-01-12 13:38:11.16291249 +0000 UTC m=+0.043635473 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 12 13:38:11 compute-0 python3.9[152910]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:38:11 compute-0 systemd[1]: Reloading.
Jan 12 13:38:11 compute-0 systemd-rc-local-generator[152931]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:11 compute-0 systemd-sysv-generator[152936]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:11 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 12 13:38:11 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 12 13:38:11 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 12 13:38:11 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 12 13:38:11 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 12 13:38:11 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 12 13:38:11 compute-0 sudo[152903]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:11 compute-0 sudo[153103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mryejlyglplgblwyvatrygdqsnqrchfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225091.7800636-1056-105231273810836/AnsiballZ_systemd.py'
Jan 12 13:38:11 compute-0 sudo[153103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:12 compute-0 python3.9[153105]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:38:12 compute-0 systemd[1]: Reloading.
Jan 12 13:38:12 compute-0 systemd-rc-local-generator[153126]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:12 compute-0 systemd-sysv-generator[153132]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:12 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 12 13:38:12 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 12 13:38:12 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 12 13:38:12 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 12 13:38:12 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 12 13:38:12 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 12 13:38:12 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 12 13:38:12 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 12 13:38:12 compute-0 sudo[153103]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:12 compute-0 sudo[153319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fblahmibqnpgszgdyfdbikneuuhdjsqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225092.6116354-1056-114423923724012/AnsiballZ_systemd.py'
Jan 12 13:38:12 compute-0 sudo[153319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:13 compute-0 python3.9[153321]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:38:13 compute-0 systemd[1]: Reloading.
Jan 12 13:38:13 compute-0 systemd-sysv-generator[153345]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:13 compute-0 systemd-rc-local-generator[153342]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:13 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 12 13:38:13 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 12 13:38:13 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 12 13:38:13 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 12 13:38:13 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 12 13:38:13 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 12 13:38:13 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 12 13:38:13 compute-0 sudo[153319]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:13 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 12 13:38:13 compute-0 sudo[153534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itwczkqosojcwygiddkhyejkoncstyvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225093.4168034-1056-239940988217734/AnsiballZ_systemd.py'
Jan 12 13:38:13 compute-0 sudo[153534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:13 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 12 13:38:13 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 12 13:38:13 compute-0 python3.9[153537]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:38:13 compute-0 systemd[1]: Reloading.
Jan 12 13:38:13 compute-0 systemd-rc-local-generator[153562]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:13 compute-0 systemd-sysv-generator[153565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:14 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 12 13:38:14 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 12 13:38:14 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 12 13:38:14 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 12 13:38:14 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 12 13:38:14 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 12 13:38:14 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 12 13:38:14 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 12 13:38:14 compute-0 sudo[153534]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:14 compute-0 setroubleshoot[153358]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 08e6bd2e-20fd-4693-b891-df2413a9e9d9
Jan 12 13:38:14 compute-0 setroubleshoot[153358]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 12 13:38:14 compute-0 sudo[153755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndmudxwxfdllysdnkycwyobhtsveuyst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225094.250178-1056-31451759339897/AnsiballZ_systemd.py'
Jan 12 13:38:14 compute-0 sudo[153755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:14 compute-0 python3.9[153757]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:38:14 compute-0 systemd[1]: Reloading.
Jan 12 13:38:14 compute-0 systemd-rc-local-generator[153775]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:14 compute-0 systemd-sysv-generator[153780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 12 13:38:14 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 12 13:38:14 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 12 13:38:14 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 12 13:38:14 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 12 13:38:14 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 12 13:38:14 compute-0 sudo[153755]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:15 compute-0 sudo[153967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvjnhkahnhdduvdekdqfekwmkhbuscxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225095.1020527-1093-258785450019518/AnsiballZ_file.py'
Jan 12 13:38:15 compute-0 sudo[153967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:15 compute-0 python3.9[153969]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:15 compute-0 sudo[153967]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:15 compute-0 sudo[154119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfpcvszthjlyizowizctzzzetrsmsqpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225095.5408626-1101-248139870267365/AnsiballZ_find.py'
Jan 12 13:38:15 compute-0 sudo[154119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:15 compute-0 python3.9[154121]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 12 13:38:15 compute-0 sudo[154119]: pam_unix(sudo:session): session closed for user root
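The `ansible.builtin.find` task above looks for `*.conf` files directly under `/var/lib/openstack/config/ceph` (files only, no recursion, no hidden files). Its behavior corresponds roughly to a plain `find(1)` invocation; this sketch demonstrates the equivalent query against a throwaway directory instead of the real path:

```shell
# Stand-in directory for /var/lib/openstack/config/ceph
tmp=$(mktemp -d)
touch "$tmp/ceph.conf" "$tmp/notes.txt"

# file_type=file, recurse=False, patterns=['*.conf'] maps approximately to:
find "$tmp" -maxdepth 1 -type f -name '*.conf'

rm -rf "$tmp"
```

The one reported hit would be the `.conf` file; the `.txt` file and the directory itself are excluded by `-name` and `-type f`.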
Jan 12 13:38:16 compute-0 sudo[154271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qflcjdvfqcodoopwgkekdzcctyjpdzjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225096.2397768-1115-243764595960889/AnsiballZ_stat.py'
Jan 12 13:38:16 compute-0 sudo[154271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:16 compute-0 python3.9[154273]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:16 compute-0 sudo[154271]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:16 compute-0 sudo[154394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bchxmvbfkteansdzhjzdzndsfxyalrme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225096.2397768-1115-243764595960889/AnsiballZ_copy.py'
Jan 12 13:38:16 compute-0 sudo[154394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:16 compute-0 python3.9[154396]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225096.2397768-1115-243764595960889/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:16 compute-0 sudo[154394]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:17 compute-0 sudo[154546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqwrrlsymunvjylkeihtlrrylsfklsyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225097.1220005-1131-176929626548778/AnsiballZ_file.py'
Jan 12 13:38:17 compute-0 sudo[154546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:17 compute-0 python3.9[154548]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:17 compute-0 sudo[154546]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:17 compute-0 sudo[154698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kloomlvfekdegzblxzgysldhmgvjdfkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225097.5553057-1139-178047003695083/AnsiballZ_stat.py'
Jan 12 13:38:17 compute-0 sudo[154698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:17 compute-0 python3.9[154700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:17 compute-0 sudo[154698]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:18 compute-0 sudo[154776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yknejyyupeddeweurssfgbfrgbxstgwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225097.5553057-1139-178047003695083/AnsiballZ_file.py'
Jan 12 13:38:18 compute-0 sudo[154776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:18 compute-0 python3.9[154778]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:18 compute-0 sudo[154776]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:18 compute-0 sudo[154928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqrxzxjkpngrnnmmvqwpniwtefoaqfjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225098.2982116-1151-190298634678023/AnsiballZ_stat.py'
Jan 12 13:38:18 compute-0 sudo[154928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:18 compute-0 python3.9[154930]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:18 compute-0 sudo[154928]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:18 compute-0 sudo[155006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pugpayvaqqbnzhzmzwosxbwibiheotvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225098.2982116-1151-190298634678023/AnsiballZ_file.py'
Jan 12 13:38:18 compute-0 sudo[155006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:18 compute-0 python3.9[155008]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.x2h_q1oh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:18 compute-0 sudo[155006]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:19 compute-0 sudo[155158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slxnqwtumpglljpjaodwmekdkdyiovzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225099.0168004-1163-84511851589059/AnsiballZ_stat.py'
Jan 12 13:38:19 compute-0 sudo[155158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:19 compute-0 python3.9[155160]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:19 compute-0 sudo[155158]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:19 compute-0 sudo[155236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjicdszojlhpbgbszefrichqahlatbmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225099.0168004-1163-84511851589059/AnsiballZ_file.py'
Jan 12 13:38:19 compute-0 sudo[155236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:19 compute-0 python3.9[155238]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:19 compute-0 sudo[155236]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:19 compute-0 sudo[155388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxtkvaosmqvbczsbtrunfcupthmpbuvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225099.7500236-1176-114454695614543/AnsiballZ_command.py'
Jan 12 13:38:19 compute-0 sudo[155388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:20 compute-0 python3.9[155390]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:38:20 compute-0 sudo[155388]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:20 compute-0 sudo[155541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgqdmgkomwnthlpikbgvczksfkvvlejz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225100.1826937-1184-114096342571234/AnsiballZ_edpm_nftables_from_files.py'
Jan 12 13:38:20 compute-0 sudo[155541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:20 compute-0 python3[155543]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 12 13:38:20 compute-0 sudo[155541]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:21 compute-0 sudo[155693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpbollgpkiynymgurwrhwxbmseiotmin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225100.8584814-1192-141080572572184/AnsiballZ_stat.py'
Jan 12 13:38:21 compute-0 sudo[155693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:21 compute-0 python3.9[155695]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:21 compute-0 sudo[155693]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:21 compute-0 sudo[155771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtngjwphrymppckniwpwmxgogezakjig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225100.8584814-1192-141080572572184/AnsiballZ_file.py'
Jan 12 13:38:21 compute-0 sudo[155771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:21 compute-0 python3.9[155773]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:21 compute-0 sudo[155771]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:21 compute-0 sudo[155923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pekfkemevbgktpfzrxucedvvranwxwwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225101.6519933-1204-87952217467712/AnsiballZ_stat.py'
Jan 12 13:38:21 compute-0 sudo[155923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:21 compute-0 python3.9[155925]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:22 compute-0 sudo[155923]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:22 compute-0 sudo[156001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmstiqkxjwqqimevmjhybpeghdflnuzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225101.6519933-1204-87952217467712/AnsiballZ_file.py'
Jan 12 13:38:22 compute-0 sudo[156001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:22 compute-0 python3.9[156003]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:22 compute-0 sudo[156001]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:22 compute-0 sudo[156153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jowkviatacgdcypgelyktwxhzncmgvwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225102.4320679-1216-183210447362443/AnsiballZ_stat.py'
Jan 12 13:38:22 compute-0 sudo[156153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:22 compute-0 python3.9[156155]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:22 compute-0 sudo[156153]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:22 compute-0 sudo[156231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdptfakswfetrmiioycdoddibnhmcvtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225102.4320679-1216-183210447362443/AnsiballZ_file.py'
Jan 12 13:38:22 compute-0 sudo[156231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:23 compute-0 python3.9[156233]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:23 compute-0 sudo[156231]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:23 compute-0 sudo[156383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrvytmlmimhnbqvykklclgoaanpflmha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225103.1626158-1228-36503037515318/AnsiballZ_stat.py'
Jan 12 13:38:23 compute-0 sudo[156383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:23 compute-0 python3.9[156385]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:23 compute-0 sudo[156383]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:23 compute-0 sudo[156461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snvodwcfulxsejndrgssmnerfquvjzey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225103.1626158-1228-36503037515318/AnsiballZ_file.py'
Jan 12 13:38:23 compute-0 sudo[156461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:23 compute-0 python3.9[156463]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:23 compute-0 sudo[156461]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:24 compute-0 sudo[156613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbtktvxaryerzprauxuldxjmjjxucnse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225103.9836977-1240-15420700136467/AnsiballZ_stat.py'
Jan 12 13:38:24 compute-0 sudo[156613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:24 compute-0 python3.9[156615]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:24 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 12 13:38:24 compute-0 sudo[156613]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:24 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 12 13:38:24 compute-0 sudo[156738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gubhokwraldwwtorefqwyfnpydtwtsot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225103.9836977-1240-15420700136467/AnsiballZ_copy.py'
Jan 12 13:38:24 compute-0 sudo[156738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:24 compute-0 python3.9[156740]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225103.9836977-1240-15420700136467/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:24 compute-0 sudo[156738]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:25 compute-0 sudo[156890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhjkrwdpeyspddvwuaiqiwrfvsmtqzmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225104.9467356-1255-58032329763314/AnsiballZ_file.py'
Jan 12 13:38:25 compute-0 sudo[156890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:25 compute-0 python3.9[156892]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:25 compute-0 sudo[156890]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:25 compute-0 sudo[157042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtwyzanomydrckzdhdbsaqcscautuqfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225105.4041717-1263-198973382271066/AnsiballZ_command.py'
Jan 12 13:38:25 compute-0 sudo[157042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:25 compute-0 python3.9[157044]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:38:25 compute-0 sudo[157042]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:26 compute-0 sudo[157197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpuwlveupvmypeswjwcljrztahfgkgvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225105.8724039-1271-108552636928137/AnsiballZ_blockinfile.py'
Jan 12 13:38:26 compute-0 sudo[157197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:26 compute-0 python3.9[157199]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:26 compute-0 sudo[157197]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:26 compute-0 sudo[157349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjvpaowzxyxcelewiyzefcewqpeikhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225106.4799185-1280-151321087161647/AnsiballZ_command.py'
Jan 12 13:38:26 compute-0 sudo[157349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:26 compute-0 python3.9[157351]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:38:26 compute-0 sudo[157349]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:27 compute-0 sudo[157502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcgfornuwguzxkedvwtirjwjhplfijyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225106.9296718-1288-61079276460548/AnsiballZ_stat.py'
Jan 12 13:38:27 compute-0 sudo[157502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:27 compute-0 python3.9[157504]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:38:27 compute-0 sudo[157502]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:27 compute-0 sudo[157656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehuihnrmhgrbipladsruloxkakpypfoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225107.3715105-1296-182429104931966/AnsiballZ_command.py'
Jan 12 13:38:27 compute-0 sudo[157656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:27 compute-0 python3.9[157658]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:38:27 compute-0 sudo[157656]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:27 compute-0 sudo[157811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srsfswtcaekygcrqxlshupycofgsiknb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225107.817031-1304-221648480380375/AnsiballZ_file.py'
Jan 12 13:38:27 compute-0 sudo[157811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:28 compute-0 python3.9[157813]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:28 compute-0 sudo[157811]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:28 compute-0 sudo[157963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efbsmljqayuxhasfwudwsaactjhucnpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225108.2465062-1312-115160452610858/AnsiballZ_stat.py'
Jan 12 13:38:28 compute-0 sudo[157963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:28 compute-0 python3.9[157965]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:28 compute-0 sudo[157963]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:28 compute-0 sudo[158086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbmlokmodmasclzvlturxoqzhddortyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225108.2465062-1312-115160452610858/AnsiballZ_copy.py'
Jan 12 13:38:28 compute-0 sudo[158086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:28 compute-0 python3.9[158088]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225108.2465062-1312-115160452610858/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:28 compute-0 sudo[158086]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:29 compute-0 sudo[158238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvxkgsnfimtoqtgxiluqlfkapklbmokp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225109.028394-1327-68175843803313/AnsiballZ_stat.py'
Jan 12 13:38:29 compute-0 sudo[158238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:29 compute-0 python3.9[158240]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:29 compute-0 sudo[158238]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:29 compute-0 sudo[158361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycwntqtnmiwtfmhupcplsrcopxxiznev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225109.028394-1327-68175843803313/AnsiballZ_copy.py'
Jan 12 13:38:29 compute-0 sudo[158361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:29 compute-0 python3.9[158363]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225109.028394-1327-68175843803313/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:29 compute-0 sudo[158361]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:29 compute-0 sudo[158513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cknywoeslmqbybcxwjlucxdfxzputenc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225109.8177965-1342-232376639956878/AnsiballZ_stat.py'
Jan 12 13:38:29 compute-0 sudo[158513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:30 compute-0 python3.9[158515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:30 compute-0 sudo[158513]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:30 compute-0 sudo[158636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mujjlguypfbpcynzwrgtlspwpczkvhyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225109.8177965-1342-232376639956878/AnsiballZ_copy.py'
Jan 12 13:38:30 compute-0 sudo[158636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:30 compute-0 python3.9[158638]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225109.8177965-1342-232376639956878/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:30 compute-0 sudo[158636]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:30 compute-0 sudo[158788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkqtvvesibnsdmfrfwqaaynjmyqiisti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225110.6118803-1357-127886484188785/AnsiballZ_systemd.py'
Jan 12 13:38:30 compute-0 sudo[158788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:31 compute-0 python3.9[158790]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:38:31 compute-0 systemd[1]: Reloading.
Jan 12 13:38:31 compute-0 systemd-sysv-generator[158813]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:31 compute-0 systemd-rc-local-generator[158810]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:31 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 12 13:38:31 compute-0 sudo[158788]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:31 compute-0 sudo[158979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taihjjnqlahzzwtprpwmgoeimrbiyluw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225111.351135-1365-179576247249226/AnsiballZ_systemd.py'
Jan 12 13:38:31 compute-0 sudo[158979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:31 compute-0 python3.9[158981]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 12 13:38:31 compute-0 systemd[1]: Reloading.
Jan 12 13:38:31 compute-0 systemd-rc-local-generator[159001]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:31 compute-0 systemd-sysv-generator[159004]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:31 compute-0 systemd[1]: Reloading.
Jan 12 13:38:32 compute-0 systemd-rc-local-generator[159039]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:32 compute-0 systemd-sysv-generator[159042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:32 compute-0 sudo[158979]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:32 compute-0 podman[159054]: 2026-01-12 13:38:32.249538013 +0000 UTC m=+0.058794120 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Jan 12 13:38:32 compute-0 sshd-session[104731]: Connection closed by 192.168.122.30 port 58208
Jan 12 13:38:32 compute-0 sshd-session[104728]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:38:32 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 12 13:38:32 compute-0 systemd[1]: session-22.scope: Consumed 2min 15.914s CPU time.
Jan 12 13:38:32 compute-0 systemd-logind[775]: Session 22 logged out. Waiting for processes to exit.
Jan 12 13:38:32 compute-0 systemd-logind[775]: Removed session 22.
Jan 12 13:38:38 compute-0 sshd-session[159101]: Accepted publickey for zuul from 192.168.122.30 port 43470 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:38:38 compute-0 systemd-logind[775]: New session 23 of user zuul.
Jan 12 13:38:38 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 12 13:38:38 compute-0 sshd-session[159101]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:38:38 compute-0 python3.9[159254]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:38:39 compute-0 python3.9[159408]: ansible-ansible.builtin.service_facts Invoked
Jan 12 13:38:39 compute-0 network[159425]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 12 13:38:39 compute-0 network[159426]: 'network-scripts' will be removed from distribution in near future.
Jan 12 13:38:39 compute-0 network[159427]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 12 13:38:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:38:40.191 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:38:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:38:40.192 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:38:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:38:40.192 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:38:41 compute-0 podman[159529]: 2026-01-12 13:38:41.23738785 +0000 UTC m=+0.049253082 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:38:41 compute-0 sudo[159712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsaaktkwejooljcepgpreyqbzxvxwpaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225121.5985548-42-64519062319925/AnsiballZ_setup.py'
Jan 12 13:38:41 compute-0 sudo[159712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:42 compute-0 python3.9[159714]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 12 13:38:42 compute-0 sudo[159712]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:42 compute-0 sudo[159796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxrqhlklnjneryylgngvmnzekpnimrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225121.5985548-42-64519062319925/AnsiballZ_dnf.py'
Jan 12 13:38:42 compute-0 sudo[159796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:42 compute-0 python3.9[159798]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:38:46 compute-0 sudo[159796]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:46 compute-0 sudo[159949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnsbtmbbwxaubeclauyikutyjqgqaojy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225126.6820123-54-270868506336147/AnsiballZ_stat.py'
Jan 12 13:38:46 compute-0 sudo[159949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:47 compute-0 python3.9[159951]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:38:47 compute-0 sudo[159949]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:47 compute-0 sudo[160101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqojlrsvfvpffmwnxbrhrcakcmvxmchs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225127.2440426-64-9571015931876/AnsiballZ_command.py'
Jan 12 13:38:47 compute-0 sudo[160101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:47 compute-0 python3.9[160103]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:38:47 compute-0 sudo[160101]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:47 compute-0 sudo[160254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdopaydokadjtbxpcmzyzlnbkbthbtqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225127.8410888-74-262145488099415/AnsiballZ_stat.py'
Jan 12 13:38:48 compute-0 sudo[160254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:48 compute-0 python3.9[160256]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:38:48 compute-0 sudo[160254]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:48 compute-0 sudo[160406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlsbksktfcguyrxqjrgkpdiehpnvfgid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225128.2636435-82-134186602402659/AnsiballZ_command.py'
Jan 12 13:38:48 compute-0 sudo[160406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:48 compute-0 python3.9[160408]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:38:48 compute-0 sudo[160406]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:48 compute-0 sudo[160559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmrehuqrmzurzkzkrmserctculsaszdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225128.6877272-90-80670741362039/AnsiballZ_stat.py'
Jan 12 13:38:48 compute-0 sudo[160559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:48 compute-0 python3.9[160561]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:49 compute-0 sudo[160559]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:49 compute-0 sudo[160682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viwlfqlmwugwspkbeszgsbpssnlckpqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225128.6877272-90-80670741362039/AnsiballZ_copy.py'
Jan 12 13:38:49 compute-0 sudo[160682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:49 compute-0 python3.9[160684]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225128.6877272-90-80670741362039/.source.iscsi _original_basename=.kydw8qd3 follow=False checksum=177e3dfcb3ec64c4838ff5ea4e09442852308429 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:49 compute-0 sudo[160682]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:49 compute-0 sudo[160834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtwtpgoejxxxgtizefqbwobwwbagejal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225129.592341-105-238241379129283/AnsiballZ_file.py'
Jan 12 13:38:49 compute-0 sudo[160834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:50 compute-0 python3.9[160836]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:50 compute-0 sudo[160834]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:50 compute-0 sudo[160986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djuclvohddoqgcltabkivyqnfuyndhzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225130.1517048-113-61052686614698/AnsiballZ_lineinfile.py'
Jan 12 13:38:50 compute-0 sudo[160986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:50 compute-0 python3.9[160988]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:50 compute-0 sudo[160986]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:51 compute-0 sudo[161138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewzwtajaytrzjxvztoygqximduxsslts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225130.7342715-122-238349803314698/AnsiballZ_systemd_service.py'
Jan 12 13:38:51 compute-0 sudo[161138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:51 compute-0 python3.9[161140]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:38:51 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 12 13:38:51 compute-0 sudo[161138]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:51 compute-0 sudo[161294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctyzficwprgsyphwogtlttirkoxyvkpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225131.5636055-130-275768976841291/AnsiballZ_systemd_service.py'
Jan 12 13:38:51 compute-0 sudo[161294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:52 compute-0 python3.9[161296]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:38:52 compute-0 systemd[1]: Reloading.
Jan 12 13:38:52 compute-0 systemd-sysv-generator[161322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:52 compute-0 systemd-rc-local-generator[161318]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:52 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 12 13:38:52 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 12 13:38:52 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 12 13:38:52 compute-0 systemd[1]: Started Open-iSCSI.
Jan 12 13:38:52 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 12 13:38:52 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 12 13:38:52 compute-0 sudo[161294]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:52 compute-0 python3.9[161496]: ansible-ansible.builtin.service_facts Invoked
Jan 12 13:38:52 compute-0 network[161513]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 12 13:38:52 compute-0 network[161514]: 'network-scripts' will be removed from distribution in near future.
Jan 12 13:38:52 compute-0 network[161515]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 12 13:38:55 compute-0 sudo[161784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xijvspwpkntpguskyvhcpjpphbncunlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225135.0976157-153-140735472656165/AnsiballZ_dnf.py'
Jan 12 13:38:55 compute-0 sudo[161784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:55 compute-0 python3.9[161786]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:38:57 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:38:57 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:38:57 compute-0 systemd[1]: Reloading.
Jan 12 13:38:57 compute-0 systemd-sysv-generator[161835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:38:57 compute-0 systemd-rc-local-generator[161831]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:38:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:38:57 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:38:57 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:38:57 compute-0 systemd[1]: run-r7127ca4ab2a24859a646541e050e74b4.service: Deactivated successfully.
Jan 12 13:38:57 compute-0 sudo[161784]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:58 compute-0 sudo[162100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdqdbpfokgybuwagpxuxvppwtoqiistp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225138.1044202-162-8840492003617/AnsiballZ_file.py'
Jan 12 13:38:58 compute-0 sudo[162100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:58 compute-0 python3.9[162102]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 12 13:38:58 compute-0 sudo[162100]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:58 compute-0 sudo[162252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pheyowgrvqpjsskbihzkbffwpfvhewym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225138.571517-170-257420171516173/AnsiballZ_modprobe.py'
Jan 12 13:38:58 compute-0 sudo[162252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:59 compute-0 python3.9[162254]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 12 13:38:59 compute-0 sudo[162252]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:59 compute-0 sudo[162408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxojxyilgnmiikzqizkotzpfocmpzmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225139.1677787-178-148110746508871/AnsiballZ_stat.py'
Jan 12 13:38:59 compute-0 sudo[162408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:59 compute-0 python3.9[162410]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:38:59 compute-0 sudo[162408]: pam_unix(sudo:session): session closed for user root
Jan 12 13:38:59 compute-0 sudo[162531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swlerbsbrrvwdjmiqigowlcftcyqzoav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225139.1677787-178-148110746508871/AnsiballZ_copy.py'
Jan 12 13:38:59 compute-0 sudo[162531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:38:59 compute-0 python3.9[162533]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225139.1677787-178-148110746508871/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:38:59 compute-0 sudo[162531]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:00 compute-0 sudo[162683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvfznqusxepgwfgblgxylaglgodfjwpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225140.078705-194-258159381109105/AnsiballZ_lineinfile.py'
Jan 12 13:39:00 compute-0 sudo[162683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:00 compute-0 python3.9[162685]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:00 compute-0 sudo[162683]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:01 compute-0 sudo[162835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ertnqhodkothmqrmygculjkwgfayhiac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225140.5293772-202-223560319277401/AnsiballZ_systemd.py'
Jan 12 13:39:01 compute-0 sudo[162835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:01 compute-0 python3.9[162837]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:39:01 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 12 13:39:01 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 12 13:39:01 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 12 13:39:01 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 12 13:39:01 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 12 13:39:01 compute-0 sudo[162835]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:01 compute-0 sudo[162991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrgqtpdxihlvorkxwakspjzgklwnxnrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225141.4326942-210-88863151661223/AnsiballZ_command.py'
Jan 12 13:39:01 compute-0 sudo[162991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:01 compute-0 python3.9[162993]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:01 compute-0 sudo[162991]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:02 compute-0 sudo[163144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddftiyctbkqgjpyppchuunecjznvovhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225141.9613154-220-157179383966878/AnsiballZ_stat.py'
Jan 12 13:39:02 compute-0 sudo[163144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:02 compute-0 python3.9[163146]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:39:02 compute-0 sudo[163144]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:02 compute-0 podman[163223]: 2026-01-12 13:39:02.582756556 +0000 UTC m=+0.080129335 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:39:02 compute-0 sudo[163319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbswltnwisxsvxgvxbilmaslukibvrut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225142.4446304-229-146957904855718/AnsiballZ_stat.py'
Jan 12 13:39:02 compute-0 sudo[163319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:02 compute-0 python3.9[163321]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:02 compute-0 sudo[163319]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:03 compute-0 sudo[163442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivkcsfebbyjmcvxhqeklvmvxnuyqwcap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225142.4446304-229-146957904855718/AnsiballZ_copy.py'
Jan 12 13:39:03 compute-0 sudo[163442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:03 compute-0 python3.9[163444]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225142.4446304-229-146957904855718/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:03 compute-0 sudo[163442]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:03 compute-0 sudo[163594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzafrxijlwkhafefspwbvlugwghjcjlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225143.2777417-244-171432903424213/AnsiballZ_command.py'
Jan 12 13:39:03 compute-0 sudo[163594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:03 compute-0 python3.9[163596]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:03 compute-0 sudo[163594]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:03 compute-0 sudo[163747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kobcjksbbgqsfcbhgtontjqnmogzgatm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225143.7449942-252-71439018417929/AnsiballZ_lineinfile.py'
Jan 12 13:39:03 compute-0 sudo[163747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:04 compute-0 python3.9[163749]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:04 compute-0 sudo[163747]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:04 compute-0 sudo[163899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfjkploguohacvmjbtavdwyaiizacghc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225144.2277095-260-85777620983092/AnsiballZ_replace.py'
Jan 12 13:39:04 compute-0 sudo[163899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:04 compute-0 python3.9[163901]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:04 compute-0 sudo[163899]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:05 compute-0 sudo[164051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmvrypfgdycyxhfbgczrroxpcxjzsadt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225144.838624-268-51523816031837/AnsiballZ_replace.py'
Jan 12 13:39:05 compute-0 sudo[164051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:05 compute-0 python3.9[164053]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:05 compute-0 sudo[164051]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:05 compute-0 sudo[164203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdsbvrozexnlaisegefuvuylvnlljzpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225145.3172314-277-140859543586515/AnsiballZ_lineinfile.py'
Jan 12 13:39:05 compute-0 sudo[164203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:05 compute-0 python3.9[164205]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:05 compute-0 sudo[164203]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:05 compute-0 sudo[164355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olltgygbqrbmvqmnhpeyxktlntiysgrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225145.7421882-277-112768549786182/AnsiballZ_lineinfile.py'
Jan 12 13:39:05 compute-0 sudo[164355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:06 compute-0 python3.9[164357]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:06 compute-0 sudo[164355]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:06 compute-0 sudo[164507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmsbrbuuipbbstkkhhfsczsygkfznghy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225146.1687074-277-96945502699295/AnsiballZ_lineinfile.py'
Jan 12 13:39:06 compute-0 sudo[164507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:06 compute-0 python3.9[164509]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:06 compute-0 sudo[164507]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:06 compute-0 sudo[164659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzdhvxwrsojsoliknjjcchtjwiykajho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225146.5908432-277-22415410887960/AnsiballZ_lineinfile.py'
Jan 12 13:39:06 compute-0 sudo[164659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:06 compute-0 python3.9[164661]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:06 compute-0 sudo[164659]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:07 compute-0 sudo[164811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzxstkpacokmhuqmmrisleevmgahxcdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225147.0391932-306-5758272897662/AnsiballZ_stat.py'
Jan 12 13:39:07 compute-0 sudo[164811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:07 compute-0 python3.9[164813]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:39:07 compute-0 sudo[164811]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:07 compute-0 sudo[164965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oivitdbozwzjiaqyjmawtolzbsprshyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225147.4986873-314-102613601145222/AnsiballZ_command.py'
Jan 12 13:39:07 compute-0 sudo[164965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:07 compute-0 python3.9[164967]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:07 compute-0 sudo[164965]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:08 compute-0 sudo[165118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqixsjxlgejofymheshqzkkkbyvyqstn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225148.0091383-323-226230423664031/AnsiballZ_systemd_service.py'
Jan 12 13:39:08 compute-0 sudo[165118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:08 compute-0 python3.9[165120]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:08 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 12 13:39:08 compute-0 sudo[165118]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:08 compute-0 sudo[165274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brfvsyhgclyesjoduhxbzpjpygbrbrgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225148.6128855-331-221686835615002/AnsiballZ_systemd_service.py'
Jan 12 13:39:08 compute-0 sudo[165274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:09 compute-0 python3.9[165276]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:09 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 12 13:39:09 compute-0 udevadm[165281]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 12 13:39:09 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 12 13:39:09 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 12 13:39:09 compute-0 multipathd[165284]: --------start up--------
Jan 12 13:39:09 compute-0 multipathd[165284]: read /etc/multipath.conf
Jan 12 13:39:09 compute-0 multipathd[165284]: path checkers start up
Jan 12 13:39:09 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 12 13:39:09 compute-0 sudo[165274]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:09 compute-0 sudo[165441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgxaspwafqzlxctvqkiijgsuewkmhuat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225149.446837-343-3535930945023/AnsiballZ_file.py'
Jan 12 13:39:09 compute-0 sudo[165441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:09 compute-0 python3.9[165443]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 12 13:39:09 compute-0 sudo[165441]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:10 compute-0 sudo[165593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vddvbztiwpzseftjruiveljtetqroyiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225149.9246683-351-143224751974392/AnsiballZ_modprobe.py'
Jan 12 13:39:10 compute-0 sudo[165593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:10 compute-0 python3.9[165595]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 12 13:39:10 compute-0 kernel: Key type psk registered
Jan 12 13:39:10 compute-0 sudo[165593]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:10 compute-0 sudo[165756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mapiwkdsqehskytxwuvaqzofxvnfupyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225150.4152615-359-208779467946090/AnsiballZ_stat.py'
Jan 12 13:39:10 compute-0 sudo[165756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:10 compute-0 python3.9[165758]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:10 compute-0 sudo[165756]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:11 compute-0 sudo[165879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbrhaidwrttagxfqdrlmuzyyupybfosq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225150.4152615-359-208779467946090/AnsiballZ_copy.py'
Jan 12 13:39:11 compute-0 sudo[165879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:11 compute-0 python3.9[165881]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225150.4152615-359-208779467946090/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:11 compute-0 sudo[165879]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:11 compute-0 podman[165981]: 2026-01-12 13:39:11.546570298 +0000 UTC m=+0.041388789 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 12 13:39:11 compute-0 sudo[166048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrojqnbbxuotsssgpwmecbiukswlilxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225151.3817165-375-145006363650774/AnsiballZ_lineinfile.py'
Jan 12 13:39:11 compute-0 sudo[166048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:11 compute-0 python3.9[166051]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:11 compute-0 sudo[166048]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:12 compute-0 sudo[166201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-megoxrabvcoihtrctqbpilawdvwlvuql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225151.849063-383-93748793224673/AnsiballZ_systemd.py'
Jan 12 13:39:12 compute-0 sudo[166201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:12 compute-0 python3.9[166203]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:39:12 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 12 13:39:12 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 12 13:39:12 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 12 13:39:12 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 12 13:39:12 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 12 13:39:12 compute-0 sudo[166201]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:12 compute-0 sudo[166357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkcglzyvwxacnlopvilqtbhmslrpchey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225152.5026379-391-280898506209993/AnsiballZ_dnf.py'
Jan 12 13:39:12 compute-0 sudo[166357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:12 compute-0 python3.9[166359]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 12 13:39:14 compute-0 systemd[1]: Reloading.
Jan 12 13:39:14 compute-0 systemd-rc-local-generator[166387]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:39:14 compute-0 systemd-sysv-generator[166391]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:39:14 compute-0 systemd[1]: Reloading.
Jan 12 13:39:14 compute-0 systemd-rc-local-generator[166419]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:39:14 compute-0 systemd-sysv-generator[166422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:39:15 compute-0 systemd-logind[775]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 12 13:39:15 compute-0 systemd-logind[775]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 12 13:39:15 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 12 13:39:15 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 12 13:39:15 compute-0 systemd[1]: Reloading.
Jan 12 13:39:15 compute-0 systemd-rc-local-generator[166510]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:39:15 compute-0 systemd-sysv-generator[166515]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:39:15 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 12 13:39:15 compute-0 sudo[166357]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:16 compute-0 sudo[167780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onkgdkqwbqbcuwdueyzqfposnknokcel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225156.0150466-399-133772605602115/AnsiballZ_systemd_service.py'
Jan 12 13:39:16 compute-0 sudo[167780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:16 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 12 13:39:16 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 12 13:39:16 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.044s CPU time.
Jan 12 13:39:16 compute-0 systemd[1]: run-r2f978bda6c7941a89fb6590b8941ce43.service: Deactivated successfully.
Jan 12 13:39:16 compute-0 python3.9[167802]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:39:16 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 12 13:39:16 compute-0 iscsid[161336]: iscsid shutting down.
Jan 12 13:39:16 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 12 13:39:16 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 12 13:39:16 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 12 13:39:16 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 12 13:39:16 compute-0 systemd[1]: Started Open-iSCSI.
Jan 12 13:39:16 compute-0 sudo[167780]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:16 compute-0 sudo[167969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avpmdiscvbcehmkpuawnovufuvkhjgpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225156.6407952-407-242215770670511/AnsiballZ_systemd_service.py'
Jan 12 13:39:16 compute-0 sudo[167969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:17 compute-0 python3.9[167971]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:39:17 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 12 13:39:17 compute-0 multipathd[165284]: exit (signal)
Jan 12 13:39:17 compute-0 multipathd[165284]: --------shut down-------
Jan 12 13:39:17 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 12 13:39:17 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 12 13:39:17 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 12 13:39:17 compute-0 multipathd[167977]: --------start up--------
Jan 12 13:39:17 compute-0 multipathd[167977]: read /etc/multipath.conf
Jan 12 13:39:17 compute-0 multipathd[167977]: path checkers start up
Jan 12 13:39:17 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 12 13:39:17 compute-0 sudo[167969]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:17 compute-0 python3.9[168134]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:39:18 compute-0 sudo[168288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdxpfkcxpentdkkoxdbhqyywfhckhnxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225158.1988509-425-178273503588186/AnsiballZ_file.py'
Jan 12 13:39:18 compute-0 sudo[168288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:18 compute-0 python3.9[168290]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:18 compute-0 sudo[168288]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:19 compute-0 sudo[168440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjjithfhwarvbdgczaklbkbjxzzbofst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225158.8305886-436-114928587299619/AnsiballZ_systemd_service.py'
Jan 12 13:39:19 compute-0 sudo[168440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:19 compute-0 python3.9[168442]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:39:19 compute-0 systemd[1]: Reloading.
Jan 12 13:39:19 compute-0 systemd-rc-local-generator[168465]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:39:19 compute-0 systemd-sysv-generator[168469]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:39:19 compute-0 sudo[168440]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:19 compute-0 python3.9[168627]: ansible-ansible.builtin.service_facts Invoked
Jan 12 13:39:19 compute-0 network[168644]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 12 13:39:19 compute-0 network[168645]: 'network-scripts' will be removed from distribution in near future.
Jan 12 13:39:19 compute-0 network[168646]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 12 13:39:22 compute-0 sudo[168916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhqlqsdihsescuevkaanvxillqhuojth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225162.0976095-455-55766324643486/AnsiballZ_systemd_service.py'
Jan 12 13:39:22 compute-0 sudo[168916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:22 compute-0 python3.9[168918]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:22 compute-0 sudo[168916]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:22 compute-0 sudo[169069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yssxzveuedtunxgpokqkpulhkpglnbbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225162.6448338-455-125943512399178/AnsiballZ_systemd_service.py'
Jan 12 13:39:22 compute-0 sudo[169069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:23 compute-0 python3.9[169071]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:23 compute-0 sudo[169069]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:23 compute-0 sudo[169222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtgsuubhkpjipqtvqztlrjzatvknpcgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225163.2236688-455-57349739789819/AnsiballZ_systemd_service.py'
Jan 12 13:39:23 compute-0 sudo[169222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:23 compute-0 python3.9[169224]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:23 compute-0 sudo[169222]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:23 compute-0 sudo[169375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moxkrtjbgadtnsytqwyobbngyeufxsqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225163.757204-455-143072186172075/AnsiballZ_systemd_service.py'
Jan 12 13:39:23 compute-0 sudo[169375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:24 compute-0 python3.9[169377]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:24 compute-0 sudo[169375]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:24 compute-0 sudo[169528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggpnvdvrcsdydnolmiyhhldwxcpywtas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225164.3150449-455-42962829625756/AnsiballZ_systemd_service.py'
Jan 12 13:39:24 compute-0 sudo[169528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:24 compute-0 python3.9[169530]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:24 compute-0 sudo[169528]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:25 compute-0 sudo[169681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glkpwjycrywrudxpbhraopstrgrwvmay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225164.8489602-455-265551173292339/AnsiballZ_systemd_service.py'
Jan 12 13:39:25 compute-0 sudo[169681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:25 compute-0 python3.9[169683]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:25 compute-0 sudo[169681]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:25 compute-0 sudo[169834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjzbdtufcjczqkjcxlhfyukrzwzvcpuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225165.3965285-455-57381760066684/AnsiballZ_systemd_service.py'
Jan 12 13:39:25 compute-0 sudo[169834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:25 compute-0 python3.9[169836]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:25 compute-0 sudo[169834]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:26 compute-0 sudo[169987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjvhwpybbmxnicqaspuukjtsjvaxgwax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225165.9250383-455-237484180065360/AnsiballZ_systemd_service.py'
Jan 12 13:39:26 compute-0 sudo[169987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:26 compute-0 python3.9[169989]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:39:26 compute-0 sudo[169987]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:26 compute-0 sudo[170140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuhrehbuxckzxockibsrqhxugipuhcyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225166.5762105-514-123136896988070/AnsiballZ_file.py'
Jan 12 13:39:26 compute-0 sudo[170140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:27 compute-0 python3.9[170142]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:27 compute-0 sudo[170140]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:27 compute-0 sudo[170292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plwxqfmctbqkmuhxadmxckszzdetnbzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225167.1357307-514-99153883685281/AnsiballZ_file.py'
Jan 12 13:39:27 compute-0 sudo[170292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:27 compute-0 python3.9[170294]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:27 compute-0 sudo[170292]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:27 compute-0 sudo[170444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgtshpxyfmxosawthlzesbboaazyoiel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225167.5451572-514-41120682505312/AnsiballZ_file.py'
Jan 12 13:39:27 compute-0 sudo[170444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:27 compute-0 python3.9[170446]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:27 compute-0 sudo[170444]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:28 compute-0 sudo[170596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxmkgceipkbmqjneoavdkcsltxejggng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225167.9628189-514-177700793325944/AnsiballZ_file.py'
Jan 12 13:39:28 compute-0 sudo[170596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:28 compute-0 python3.9[170598]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:28 compute-0 sudo[170596]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:28 compute-0 sudo[170748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlnunwgexrcbribuhnmbkkcrlnfkxatv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225168.3840568-514-248272582431847/AnsiballZ_file.py'
Jan 12 13:39:28 compute-0 sudo[170748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:28 compute-0 python3.9[170750]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:28 compute-0 sudo[170748]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:28 compute-0 sudo[170900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyugjovunkaxgpssmmiizuoyjzggtuil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225168.8123834-514-236415268956690/AnsiballZ_file.py'
Jan 12 13:39:28 compute-0 sudo[170900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:29 compute-0 python3.9[170902]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:29 compute-0 sudo[170900]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:29 compute-0 sudo[171052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgixyxpktznzemmbxgbvldqjjdlisaej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225169.2404494-514-183642422715666/AnsiballZ_file.py'
Jan 12 13:39:29 compute-0 sudo[171052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:29 compute-0 python3.9[171054]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:29 compute-0 sudo[171052]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:29 compute-0 sudo[171204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqhkfwrxyqviqcjouqsygcniciuosrlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225169.6573734-514-119471329699634/AnsiballZ_file.py'
Jan 12 13:39:29 compute-0 sudo[171204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:29 compute-0 python3.9[171206]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:29 compute-0 sudo[171204]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:30 compute-0 sudo[171356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpoexsrphmbmdauikqpzkhiqyawdlwuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225170.1073523-571-272887608414033/AnsiballZ_file.py'
Jan 12 13:39:30 compute-0 sudo[171356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:30 compute-0 python3.9[171358]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:30 compute-0 sudo[171356]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:30 compute-0 sudo[171508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxwbwpqfzgshlmkwksvxdprdeczeidpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225170.551217-571-244470557444408/AnsiballZ_file.py'
Jan 12 13:39:30 compute-0 sudo[171508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:30 compute-0 python3.9[171510]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:30 compute-0 sudo[171508]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:31 compute-0 sudo[171660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqrhheyynrmudxasetyimwmxgjeonwby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225170.9820473-571-174808068878455/AnsiballZ_file.py'
Jan 12 13:39:31 compute-0 sudo[171660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:31 compute-0 python3.9[171662]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:31 compute-0 sudo[171660]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:31 compute-0 sudo[171812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcfacsttdiyrdwfiykcsomschehtibye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225171.467846-571-255912940007275/AnsiballZ_file.py'
Jan 12 13:39:31 compute-0 sudo[171812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:31 compute-0 python3.9[171814]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:31 compute-0 sudo[171812]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:32 compute-0 sudo[171964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juxofzipwmwhhrqxdhlwljmhbisdtxxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225171.890281-571-105108073032470/AnsiballZ_file.py'
Jan 12 13:39:32 compute-0 sudo[171964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:32 compute-0 python3.9[171966]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:32 compute-0 sudo[171964]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:32 compute-0 sudo[172116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcdxmzikuvauqsraalbybehdwnnftefm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225172.3407247-571-207224890564175/AnsiballZ_file.py'
Jan 12 13:39:32 compute-0 sudo[172116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:32 compute-0 python3.9[172118]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:32 compute-0 sudo[172116]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:32 compute-0 sudo[172277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odzgsuwifnnetwbwynpbgoojtohrgkfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225172.783891-571-248757142737245/AnsiballZ_file.py'
Jan 12 13:39:32 compute-0 sudo[172277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:33 compute-0 podman[172242]: 2026-01-12 13:39:33.004495535 +0000 UTC m=+0.059321845 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Jan 12 13:39:33 compute-0 python3.9[172288]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:33 compute-0 sudo[172277]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:33 compute-0 sudo[172444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmkjomukdbhuxnhsroexzlnsdnlxqtti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225173.2444937-571-212482492964547/AnsiballZ_file.py'
Jan 12 13:39:33 compute-0 sudo[172444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:33 compute-0 python3.9[172446]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:33 compute-0 sudo[172444]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:33 compute-0 sudo[172596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edwvfodcqxdognorpbpytkrwyeqlbivs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225173.7696807-629-42735128175213/AnsiballZ_command.py'
Jan 12 13:39:33 compute-0 sudo[172596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:34 compute-0 python3.9[172598]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:34 compute-0 sudo[172596]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:34 compute-0 python3.9[172750]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 12 13:39:35 compute-0 sudo[172900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cggmbbqoezixxnqzlhmxopjhpswadwqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225174.887874-647-40775049212035/AnsiballZ_systemd_service.py'
Jan 12 13:39:35 compute-0 sudo[172900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:35 compute-0 python3.9[172902]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:39:35 compute-0 systemd[1]: Reloading.
Jan 12 13:39:35 compute-0 systemd-rc-local-generator[172923]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:39:35 compute-0 systemd-sysv-generator[172926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:39:35 compute-0 sudo[172900]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:35 compute-0 sudo[173087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wticwppgfwihpmrzeiywszwrxxuuydmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225175.6617606-655-139191059450945/AnsiballZ_command.py'
Jan 12 13:39:35 compute-0 sudo[173087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:36 compute-0 python3.9[173089]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:36 compute-0 sudo[173087]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:36 compute-0 sudo[173240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phtfigkuyiidisswgphfwnojzxedtgfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225176.2213259-655-28265844171813/AnsiballZ_command.py'
Jan 12 13:39:36 compute-0 sudo[173240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:36 compute-0 python3.9[173242]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:36 compute-0 sudo[173240]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:36 compute-0 sudo[173393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxewnkundocxbvolsntvkktkcofhyenz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225176.638468-655-242956472163759/AnsiballZ_command.py'
Jan 12 13:39:36 compute-0 sudo[173393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:36 compute-0 python3.9[173395]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:36 compute-0 sudo[173393]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:37 compute-0 sudo[173546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgqhbqviuzgzesbwpzkqriflvvenvlvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225177.0864377-655-104800806650256/AnsiballZ_command.py'
Jan 12 13:39:37 compute-0 sudo[173546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:37 compute-0 python3.9[173548]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:37 compute-0 sudo[173546]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:37 compute-0 sudo[173699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azbojmpzlmdpnehvqshawnzfzrjzvnsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225177.5566247-655-269647114770140/AnsiballZ_command.py'
Jan 12 13:39:37 compute-0 sudo[173699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:37 compute-0 python3.9[173701]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:37 compute-0 sudo[173699]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:38 compute-0 sudo[173852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdibxzzuzntapsjaccxtiqftwakufjpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225177.9870064-655-189124968497710/AnsiballZ_command.py'
Jan 12 13:39:38 compute-0 sudo[173852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:38 compute-0 python3.9[173854]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:38 compute-0 sudo[173852]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:38 compute-0 sudo[174005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwlyfzwwaizmshugdmsjuvtzerdxwart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225178.414793-655-138693769935872/AnsiballZ_command.py'
Jan 12 13:39:38 compute-0 sudo[174005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:38 compute-0 python3.9[174007]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:38 compute-0 sudo[174005]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:39 compute-0 sudo[174158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irzkzecsrnoecmhtulteqzsqztfrspgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225178.8422256-655-234899290844216/AnsiballZ_command.py'
Jan 12 13:39:39 compute-0 sudo[174158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:39 compute-0 python3.9[174160]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:39:39 compute-0 sudo[174158]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:40 compute-0 sudo[174311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziqcixlnqnskhvfzkqzgzafytapxpsfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225179.886487-734-137112104842820/AnsiballZ_file.py'
Jan 12 13:39:40 compute-0 sudo[174311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:39:40.192 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:39:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:39:40.193 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:39:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:39:40.193 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:39:40 compute-0 python3.9[174313]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:40 compute-0 sudo[174311]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:40 compute-0 sudo[174463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioidqwnxxuxibwjiuanmshlizsakntkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225180.331399-734-60128921093078/AnsiballZ_file.py'
Jan 12 13:39:40 compute-0 sudo[174463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:40 compute-0 python3.9[174465]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:40 compute-0 sudo[174463]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:40 compute-0 sudo[174615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwhosknqeaahvsyacuvipwwwxyhrguou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225180.7704911-734-142918265380729/AnsiballZ_file.py'
Jan 12 13:39:40 compute-0 sudo[174615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:41 compute-0 python3.9[174617]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:41 compute-0 sudo[174615]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:41 compute-0 sudo[174767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-horzussyhzsmhnmgjylmpjetsgbwqlaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225181.2446926-756-68001171930889/AnsiballZ_file.py'
Jan 12 13:39:41 compute-0 sudo[174767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:41 compute-0 python3.9[174769]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:41 compute-0 sudo[174767]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:41 compute-0 podman[174770]: 2026-01-12 13:39:41.64277142 +0000 UTC m=+0.059735789 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 12 13:39:41 compute-0 sudo[174934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgftpmarvfryepikrzjbcbunmcmjtajf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225181.700182-756-35010672492661/AnsiballZ_file.py'
Jan 12 13:39:41 compute-0 sudo[174934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:42 compute-0 python3.9[174936]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:42 compute-0 sudo[174934]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:42 compute-0 sudo[175086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsqurujwhvylrerjbpgeqclassfloxsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225182.1240156-756-37919341154905/AnsiballZ_file.py'
Jan 12 13:39:42 compute-0 sudo[175086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:42 compute-0 python3.9[175088]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:42 compute-0 sudo[175086]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:42 compute-0 sudo[175238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqckqbfpyjeylfarylxldjkqhbnovzil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225182.5450358-756-51060358522753/AnsiballZ_file.py'
Jan 12 13:39:42 compute-0 sudo[175238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:42 compute-0 python3.9[175240]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:42 compute-0 sudo[175238]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:43 compute-0 sudo[175390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmewmdiimnibnqgglouwyajnzfwekszy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225182.9506876-756-6827948128127/AnsiballZ_file.py'
Jan 12 13:39:43 compute-0 sudo[175390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:43 compute-0 python3.9[175392]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:43 compute-0 sudo[175390]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:43 compute-0 sudo[175542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sddegcswdskepoqnubfvzafcenbbiztv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225183.359637-756-142684537949978/AnsiballZ_file.py'
Jan 12 13:39:43 compute-0 sudo[175542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:43 compute-0 python3.9[175544]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:43 compute-0 sudo[175542]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:43 compute-0 sudo[175694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrbyskcfydvjtgwautknyaqkuslgdish ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225183.7875063-756-175175308678296/AnsiballZ_file.py'
Jan 12 13:39:43 compute-0 sudo[175694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:44 compute-0 python3.9[175696]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:44 compute-0 sudo[175694]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:47 compute-0 sudo[175846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhlomsxoimfttkwaypekuycvmxhimgro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225187.3389146-925-270152795101728/AnsiballZ_getent.py'
Jan 12 13:39:47 compute-0 sudo[175846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:47 compute-0 python3.9[175848]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 12 13:39:47 compute-0 sudo[175846]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:48 compute-0 sudo[175999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llcfnanmzfcwkoglahgcqnnelpuaicqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225187.9783607-933-34560186705154/AnsiballZ_group.py'
Jan 12 13:39:48 compute-0 sudo[175999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:48 compute-0 python3.9[176001]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 12 13:39:48 compute-0 groupadd[176002]: group added to /etc/group: name=nova, GID=42436
Jan 12 13:39:48 compute-0 groupadd[176002]: group added to /etc/gshadow: name=nova
Jan 12 13:39:48 compute-0 groupadd[176002]: new group: name=nova, GID=42436
Jan 12 13:39:48 compute-0 sudo[175999]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:48 compute-0 sudo[176157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ermwvdukbwcmyovbonavbzlbbnyrsyiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225188.6219037-941-253517571043800/AnsiballZ_user.py'
Jan 12 13:39:48 compute-0 sudo[176157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:49 compute-0 python3.9[176159]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 12 13:39:49 compute-0 useradd[176161]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 12 13:39:49 compute-0 useradd[176161]: add 'nova' to group 'libvirt'
Jan 12 13:39:49 compute-0 useradd[176161]: add 'nova' to shadow group 'libvirt'
Jan 12 13:39:49 compute-0 sudo[176157]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:49 compute-0 sshd-session[176192]: Accepted publickey for zuul from 192.168.122.30 port 33276 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:39:49 compute-0 systemd-logind[775]: New session 24 of user zuul.
Jan 12 13:39:49 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 12 13:39:49 compute-0 sshd-session[176192]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:39:50 compute-0 sshd-session[176195]: Received disconnect from 192.168.122.30 port 33276:11: disconnected by user
Jan 12 13:39:50 compute-0 sshd-session[176195]: Disconnected from user zuul 192.168.122.30 port 33276
Jan 12 13:39:50 compute-0 sshd-session[176192]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:39:50 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 12 13:39:50 compute-0 systemd-logind[775]: Session 24 logged out. Waiting for processes to exit.
Jan 12 13:39:50 compute-0 systemd-logind[775]: Removed session 24.
Jan 12 13:39:50 compute-0 python3.9[176345]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:50 compute-0 python3.9[176466]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225190.1530745-966-176719858074556/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:51 compute-0 python3.9[176616]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:51 compute-0 python3.9[176692]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:52 compute-0 python3.9[176842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:52 compute-0 python3.9[176963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225191.7332203-966-185970795887231/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:52 compute-0 python3.9[177113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:53 compute-0 python3.9[177234]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225192.5844297-966-260840493038094/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:53 compute-0 python3.9[177384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:54 compute-0 python3.9[177505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225193.4415932-966-246905748348754/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:54 compute-0 python3.9[177655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:54 compute-0 python3.9[177776]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225194.2493083-966-45657386907880/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:55 compute-0 sudo[177926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gatktvcjfxxfucfxshneaurgoessixvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225195.154654-1049-239649085094169/AnsiballZ_file.py'
Jan 12 13:39:55 compute-0 sudo[177926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:55 compute-0 python3.9[177928]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:55 compute-0 sudo[177926]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:55 compute-0 sudo[178078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvoabaqwkkbcmlgpmtasvvdpbgwnhlmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225195.6114154-1057-271270025062349/AnsiballZ_copy.py'
Jan 12 13:39:55 compute-0 sudo[178078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:55 compute-0 python3.9[178080]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:39:55 compute-0 sudo[178078]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:56 compute-0 sudo[178230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvlroklqihmgtqmmxptwnwqbfmkrerix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225196.1989937-1065-18656876494137/AnsiballZ_stat.py'
Jan 12 13:39:56 compute-0 sudo[178230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:56 compute-0 python3.9[178232]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:39:56 compute-0 sudo[178230]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:56 compute-0 sudo[178382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycouztbltxrkqibrndfsmkgpwgyqgcye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225196.6736403-1073-121307428094872/AnsiballZ_stat.py'
Jan 12 13:39:56 compute-0 sudo[178382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:57 compute-0 python3.9[178384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:57 compute-0 sudo[178382]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:57 compute-0 sudo[178505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wenhuyhxyunlrjftyylkjcmsrlfxienq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225196.6736403-1073-121307428094872/AnsiballZ_copy.py'
Jan 12 13:39:57 compute-0 sudo[178505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:39:57 compute-0 python3.9[178507]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1768225196.6736403-1073-121307428094872/.source _original_basename=.gocpne_c follow=False checksum=8756f51a20db8cc9a432aa2b0b5a04bd19089e79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 12 13:39:57 compute-0 sudo[178505]: pam_unix(sudo:session): session closed for user root
Jan 12 13:39:57 compute-0 python3.9[178659]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:39:58 compute-0 python3.9[178811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:58 compute-0 python3.9[178932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225198.0810127-1099-3820245962350/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:39:59 compute-0 python3.9[179082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:39:59 compute-0 python3.9[179203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225198.9320877-1114-22561059085172/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:00 compute-0 sudo[179353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wykaezcgntvumbyscqoufhrkgdxruhwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225199.8864524-1131-61131910446342/AnsiballZ_container_config_data.py'
Jan 12 13:40:00 compute-0 sudo[179353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:00 compute-0 python3.9[179355]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 12 13:40:00 compute-0 sudo[179353]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:01 compute-0 sudo[179505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-attstutsrjwhgfzqcocvxsjnrxnkfchg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225200.728846-1142-204596532971138/AnsiballZ_container_config_hash.py'
Jan 12 13:40:01 compute-0 sudo[179505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:01 compute-0 python3.9[179507]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:40:01 compute-0 sudo[179505]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:01 compute-0 sudo[179657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfogboglbjofjugsreypllnidxohgoax ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225201.4448729-1152-138076166086136/AnsiballZ_edpm_container_manage.py'
Jan 12 13:40:01 compute-0 sudo[179657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:02 compute-0 python3[179659]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:40:02 compute-0 podman[179688]: 2026-01-12 13:40:02.155543788 +0000 UTC m=+0.028203744 container create 951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 12 13:40:02 compute-0 podman[179688]: 2026-01-12 13:40:02.141667288 +0000 UTC m=+0.014327264 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 12 13:40:02 compute-0 python3[179659]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 12 13:40:02 compute-0 sudo[179657]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:02 compute-0 sudo[179865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfawponnyjdgojqngfnvriaszvpfqlrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225202.3660672-1160-153223367728818/AnsiballZ_stat.py'
Jan 12 13:40:02 compute-0 sudo[179865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:02 compute-0 python3.9[179867]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:02 compute-0 sudo[179865]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:03 compute-0 sudo[180028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofzkqtmidcilvstngcidvpdfzgbrakog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225203.0401683-1172-128217565359907/AnsiballZ_container_config_data.py'
Jan 12 13:40:03 compute-0 sudo[180028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:03 compute-0 podman[179993]: 2026-01-12 13:40:03.268407362 +0000 UTC m=+0.067654768 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 12 13:40:03 compute-0 python3.9[180035]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 12 13:40:03 compute-0 sudo[180028]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:03 compute-0 sudo[180194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obwjhnfhvgrbxxytimzhebjixpbhwvct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225203.6403353-1183-154561492511842/AnsiballZ_container_config_hash.py'
Jan 12 13:40:03 compute-0 sudo[180194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:03 compute-0 python3.9[180196]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:40:03 compute-0 sudo[180194]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:04 compute-0 sudo[180346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uurilyqjmqdfxbtfwhilhfaucprcagvu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225204.200423-1193-113240946653008/AnsiballZ_edpm_container_manage.py'
Jan 12 13:40:04 compute-0 sudo[180346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:04 compute-0 python3[180348]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:40:04 compute-0 podman[180377]: 2026-01-12 13:40:04.758948829 +0000 UTC m=+0.027816158 container create e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=edpm)
Jan 12 13:40:04 compute-0 podman[180377]: 2026-01-12 13:40:04.74593913 +0000 UTC m=+0.014806480 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 12 13:40:04 compute-0 python3[180348]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 12 13:40:04 compute-0 sudo[180346]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:05 compute-0 sudo[180554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-allhlyvnxdawndonsentwelcacpyghek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225205.0935838-1201-155326019440602/AnsiballZ_stat.py'
Jan 12 13:40:05 compute-0 sudo[180554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:05 compute-0 python3.9[180556]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:05 compute-0 sudo[180554]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:05 compute-0 sudo[180708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqihekjjdzuocnaybwjektyelgodjquf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225205.6257944-1210-118490769885731/AnsiballZ_file.py'
Jan 12 13:40:05 compute-0 sudo[180708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:05 compute-0 python3.9[180710]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:05 compute-0 sudo[180708]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:06 compute-0 sudo[180859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emncsstbgwakazirtioawifdrptmaode ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225206.0134988-1210-192594787252828/AnsiballZ_copy.py'
Jan 12 13:40:06 compute-0 sudo[180859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:06 compute-0 python3.9[180861]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768225206.0134988-1210-192594787252828/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:06 compute-0 sudo[180859]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:06 compute-0 sudo[180935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djrrwjlbsccumnqmohggjjglmnbvtmhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225206.0134988-1210-192594787252828/AnsiballZ_systemd.py'
Jan 12 13:40:06 compute-0 sudo[180935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:06 compute-0 python3.9[180937]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:40:06 compute-0 systemd[1]: Reloading.
Jan 12 13:40:06 compute-0 systemd-rc-local-generator[180958]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:40:06 compute-0 systemd-sysv-generator[180962]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:40:07 compute-0 sudo[180935]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:07 compute-0 sudo[181047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfcypaksdlnbzqsjgmxjjanfpjlcozqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225206.0134988-1210-192594787252828/AnsiballZ_systemd.py'
Jan 12 13:40:07 compute-0 sudo[181047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:07 compute-0 python3.9[181049]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:40:07 compute-0 systemd[1]: Reloading.
Jan 12 13:40:07 compute-0 systemd-rc-local-generator[181072]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:40:07 compute-0 systemd-sysv-generator[181075]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:40:07 compute-0 systemd[1]: Starting nova_compute container...
Jan 12 13:40:07 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:07 compute-0 podman[181088]: 2026-01-12 13:40:07.84113215 +0000 UTC m=+0.067195175 container init e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 12 13:40:07 compute-0 podman[181088]: 2026-01-12 13:40:07.847950023 +0000 UTC m=+0.074013049 container start e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute)
Jan 12 13:40:07 compute-0 podman[181088]: nova_compute
Jan 12 13:40:07 compute-0 nova_compute[181100]: + sudo -E kolla_set_configs
Jan 12 13:40:07 compute-0 systemd[1]: Started nova_compute container.
Jan 12 13:40:07 compute-0 sudo[181047]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Validating config file
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying service configuration files
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Deleting /etc/ceph
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Creating directory /etc/ceph
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /etc/ceph
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Writing out command to execute
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:07 compute-0 nova_compute[181100]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 12 13:40:07 compute-0 nova_compute[181100]: ++ cat /run_command
Jan 12 13:40:07 compute-0 nova_compute[181100]: + CMD=nova-compute
Jan 12 13:40:07 compute-0 nova_compute[181100]: + ARGS=
Jan 12 13:40:07 compute-0 nova_compute[181100]: + sudo kolla_copy_cacerts
Jan 12 13:40:07 compute-0 nova_compute[181100]: + [[ ! -n '' ]]
Jan 12 13:40:07 compute-0 nova_compute[181100]: + . kolla_extend_start
Jan 12 13:40:07 compute-0 nova_compute[181100]: + echo 'Running command: '\''nova-compute'\'''
Jan 12 13:40:07 compute-0 nova_compute[181100]: Running command: 'nova-compute'
Jan 12 13:40:07 compute-0 nova_compute[181100]: + umask 0022
Jan 12 13:40:07 compute-0 nova_compute[181100]: + exec nova-compute
Jan 12 13:40:08 compute-0 python3.9[181261]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:09 compute-0 python3.9[181412]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:09 compute-0 nova_compute[181100]: 2026-01-12 13:40:09.543 181104 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 12 13:40:09 compute-0 nova_compute[181100]: 2026-01-12 13:40:09.543 181104 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 12 13:40:09 compute-0 nova_compute[181100]: 2026-01-12 13:40:09.544 181104 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 12 13:40:09 compute-0 nova_compute[181100]: 2026-01-12 13:40:09.544 181104 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 12 13:40:09 compute-0 python3.9[181562]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:09 compute-0 nova_compute[181100]: 2026-01-12 13:40:09.647 181104 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:40:09 compute-0 nova_compute[181100]: 2026-01-12 13:40:09.656 181104 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:40:09 compute-0 nova_compute[181100]: 2026-01-12 13:40:09.657 181104 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 12 13:40:10 compute-0 sudo[181716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwrcibnroupgrgyqzbtmzfzfsovqpcnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225209.9326062-1270-59045959326141/AnsiballZ_podman_container.py'
Jan 12 13:40:10 compute-0 sudo[181716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.307 181104 INFO nova.virt.driver [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.391 181104 INFO nova.compute.provider_config [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.402 181104 DEBUG oslo_concurrency.lockutils [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.402 181104 DEBUG oslo_concurrency.lockutils [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.403 181104 DEBUG oslo_concurrency.lockutils [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.403 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.403 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.403 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.403 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.404 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.404 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.404 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.404 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.404 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.404 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.405 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.405 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.405 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.405 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.405 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.406 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.406 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.406 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.406 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.406 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.406 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.407 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.407 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.407 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.407 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.407 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.407 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.408 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.408 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.408 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.408 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.408 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.409 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.409 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.409 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.409 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.409 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.409 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.410 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.410 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.410 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.410 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.410 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.411 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.411 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.411 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.411 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.411 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.411 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.412 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.412 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.412 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.412 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.412 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.413 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.413 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.413 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.413 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.413 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.413 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.414 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.414 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.414 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.414 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.414 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.414 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.415 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.415 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.415 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.415 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.415 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.415 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.416 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.416 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.416 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.416 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.416 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.417 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.417 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.417 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.417 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.417 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.417 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.418 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.418 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.418 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.418 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.418 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.418 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.419 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.419 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.419 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.419 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.419 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.419 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.420 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.420 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.420 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.420 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.420 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.420 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.421 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.421 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.421 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.421 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.421 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.421 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.422 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.422 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.422 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.422 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.422 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.423 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.423 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.423 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.423 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.423 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.423 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.424 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.424 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.424 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.424 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.424 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.424 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.425 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.425 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.425 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.425 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.425 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.425 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.426 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.426 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.426 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.426 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.426 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.426 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.427 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.427 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.427 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.427 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.427 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.427 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.428 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.428 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.428 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.428 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.428 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.428 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.429 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.429 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.429 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.429 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.429 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.430 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.430 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.430 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.430 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.430 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.431 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.431 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.431 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.431 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.431 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.431 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.432 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.432 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.432 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.432 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.432 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.432 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.433 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.433 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.433 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.433 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.433 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.434 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.434 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.434 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.434 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.434 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.434 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.435 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.435 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.435 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.435 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.435 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.436 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.436 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.436 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.436 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.436 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.436 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.437 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.437 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.437 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.437 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.437 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.437 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.438 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.438 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.438 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.438 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.438 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.438 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.439 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.439 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.439 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.439 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.439 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.440 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.440 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.440 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.440 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.440 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.440 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.441 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.441 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.441 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.441 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.441 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.441 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.442 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.442 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.442 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.442 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.442 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.442 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.443 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.443 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.443 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.443 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.443 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.444 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.444 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.444 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.444 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.444 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.444 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.445 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.445 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.445 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.445 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.445 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.445 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.446 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.446 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.446 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.446 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.446 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.447 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.447 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.447 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.447 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.447 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.447 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.448 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.448 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.448 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.448 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.448 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.448 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.449 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.449 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.449 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.449 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.449 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.449 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.450 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.450 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.450 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.450 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.450 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.450 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.451 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.451 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.451 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.451 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.451 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.452 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.452 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.452 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.452 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.452 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.452 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.453 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.453 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.453 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.453 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.453 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.453 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.454 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.454 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.454 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.454 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.454 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.454 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.455 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.455 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.455 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.455 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.455 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.456 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.456 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.456 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.456 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.456 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.456 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.457 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.457 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.457 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.457 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.457 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.457 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.458 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.458 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.458 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.458 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.458 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.459 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.459 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.459 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.459 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.459 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.459 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.460 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.460 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.460 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.460 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.460 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.460 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.461 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.461 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.461 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.461 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.461 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.461 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.462 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.462 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.462 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.462 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.462 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.463 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.463 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.463 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.463 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.463 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.463 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.464 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.464 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.464 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.464 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.464 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.464 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.465 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.465 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.465 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.465 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.466 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.466 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.466 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.466 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.466 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.466 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.467 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.467 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.467 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.467 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.467 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.467 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.468 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.468 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.468 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.468 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.468 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.469 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.469 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.469 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.469 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.469 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.469 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.470 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.470 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.470 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.470 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.470 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.470 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.471 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.471 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.471 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.471 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.471 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.471 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.472 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.472 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.472 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.472 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.472 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.473 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.473 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.473 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.473 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.473 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.473 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.474 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.474 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.474 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.474 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.474 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.474 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.475 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.475 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.475 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.475 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.475 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.475 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.476 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.476 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.476 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.476 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.476 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.477 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.477 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.477 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.477 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.477 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.477 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.478 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.478 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.478 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.478 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.478 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.478 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.479 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.479 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.479 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.479 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.479 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.479 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.480 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.480 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.480 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.480 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.480 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.480 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.481 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.481 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.481 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.481 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.481 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.482 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.482 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.482 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.482 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.482 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.482 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.483 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.483 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.483 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.483 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.483 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.483 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.484 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.484 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.484 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.484 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.484 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.485 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.485 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.485 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.485 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.485 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.485 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.486 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.486 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.486 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.486 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.486 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.486 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.487 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.487 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.487 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.487 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.487 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.488 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.488 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.488 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.488 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.488 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.488 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.489 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.489 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.489 181104 WARNING oslo_config.cfg [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 12 13:40:10 compute-0 nova_compute[181100]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 12 13:40:10 compute-0 nova_compute[181100]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 12 13:40:10 compute-0 nova_compute[181100]: and ``live_migration_inbound_addr`` respectively.
Jan 12 13:40:10 compute-0 nova_compute[181100]: ).  Its value may be silently ignored in the future.
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.489 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.489 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.490 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.490 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.490 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.490 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.490 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.491 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.491 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.491 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.491 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.491 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.491 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.492 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.492 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.492 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.492 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.492 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.493 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.493 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.493 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.493 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.493 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.493 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.494 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.494 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.494 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.494 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.494 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.495 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.495 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.495 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.495 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.495 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.495 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.496 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.496 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.496 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.496 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.496 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.497 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.497 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.497 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.497 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.497 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.497 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.498 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.498 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.498 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.498 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.498 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.498 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.499 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.499 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.499 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.499 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.499 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.500 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.500 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.500 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.500 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.500 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.500 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.501 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.501 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.501 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.501 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.501 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.501 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.502 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.502 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.502 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.502 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.502 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.502 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.503 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.503 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.503 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.503 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.503 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.503 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.504 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.504 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.504 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.504 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.504 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.505 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 python3.9[181718]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.505 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.505 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.505 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.505 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.505 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.506 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.506 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.506 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.506 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.506 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.506 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.507 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.507 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.507 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.507 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.507 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.507 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.508 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.508 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.508 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.508 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.508 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.508 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.509 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.509 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.509 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.509 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.510 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.510 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.510 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.510 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.510 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.510 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.511 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.511 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.511 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.511 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.511 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.511 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.512 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.512 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.512 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.512 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.512 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.512 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.513 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.513 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.513 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.513 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.513 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.513 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.514 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.514 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.514 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.514 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.514 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.514 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.515 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.515 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.515 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.516 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.516 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.516 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.516 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.516 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.516 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.517 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.517 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.517 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.517 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.517 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.518 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.518 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.518 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.518 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.518 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.518 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.519 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.519 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.519 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.519 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.519 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.519 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.519 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.520 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.520 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.520 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.520 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.520 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.520 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.521 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.521 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.521 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.521 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.521 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.522 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.522 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.522 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.522 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.522 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.522 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.523 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.523 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.523 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.523 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.523 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.523 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.524 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.524 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.524 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.524 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.524 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.525 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.525 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.525 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.525 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.525 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.525 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.526 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.526 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.526 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.526 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.527 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.527 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.527 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.527 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.528 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.528 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.528 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.528 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.528 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.528 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.529 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.529 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.529 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.529 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.529 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.530 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.530 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.530 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.530 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.530 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.530 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.531 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.531 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.531 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.531 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.531 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.531 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.532 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.532 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.532 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.532 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.532 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.532 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.533 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.533 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.533 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.533 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.533 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.533 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.534 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.534 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.534 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.534 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.535 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.535 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.535 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.535 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.535 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.535 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.536 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.536 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.536 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.536 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.536 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.536 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.537 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.537 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.537 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.537 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.537 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.537 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.538 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.538 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.538 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.538 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.538 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.539 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.539 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.539 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.539 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.539 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.539 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.540 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.540 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.540 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.540 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.540 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.540 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.541 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.541 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.541 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.541 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.541 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.542 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.542 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.542 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.542 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.542 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.542 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.543 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.543 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.543 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.543 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.543 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.543 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.544 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.544 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.544 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.544 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.544 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.545 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.545 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.545 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.545 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.545 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.545 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.546 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.546 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.546 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.546 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.546 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.546 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.547 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.547 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.547 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.547 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.547 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.547 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.548 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.548 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.548 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.548 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.548 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.549 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.549 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.549 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.549 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.549 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.549 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.550 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.550 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.550 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.550 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.550 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.550 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.551 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.551 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.551 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.551 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.551 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.551 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.552 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.552 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.552 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.552 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.552 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.552 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.553 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.553 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.553 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.553 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.553 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.553 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.554 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.554 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.554 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.554 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.554 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.555 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.555 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.555 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.555 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.555 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.555 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.556 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.556 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.556 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.556 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.556 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.556 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.557 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.557 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.557 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.557 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.557 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.557 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.558 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.558 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.558 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.558 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.558 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.558 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.559 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.559 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.559 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.559 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.559 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.560 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.560 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.560 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.560 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.560 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.560 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.561 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.561 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.561 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.561 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.561 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.561 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.562 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.562 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.562 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.562 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.562 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.562 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.563 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.563 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.563 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.563 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.563 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.563 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.564 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.564 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.564 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.564 181104 DEBUG oslo_service.service [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.565 181104 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 12 13:40:10 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.576 181104 DEBUG nova.virt.libvirt.host [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.577 181104 DEBUG nova.virt.libvirt.host [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.577 181104 DEBUG nova.virt.libvirt.host [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.577 181104 DEBUG nova.virt.libvirt.host [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 12 13:40:10 compute-0 sudo[181716]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.585 181104 DEBUG nova.virt.libvirt.host [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f054f27b970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.588 181104 DEBUG nova.virt.libvirt.host [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f054f27b970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.588 181104 INFO nova.virt.libvirt.driver [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Connection event '1' reason 'None'
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.603 181104 WARNING nova.virt.libvirt.driver [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 12 13:40:10 compute-0 nova_compute[181100]: 2026-01-12 13:40:10.604 181104 DEBUG nova.virt.libvirt.volume.mount [None req-d1520507-a006-44c1-b33b-b94cc1a25239 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 12 13:40:10 compute-0 sudo[181916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmcbqwxjzefntfysmlxbhozxeefjcwsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225210.7400591-1278-20747274438168/AnsiballZ_systemd.py'
Jan 12 13:40:10 compute-0 sudo[181916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:11 compute-0 python3.9[181918]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 12 13:40:11 compute-0 systemd[1]: Stopping nova_compute container...
Jan 12 13:40:11 compute-0 nova_compute[181100]: 2026-01-12 13:40:11.259 181104 DEBUG oslo_concurrency.lockutils [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:40:11 compute-0 nova_compute[181100]: 2026-01-12 13:40:11.259 181104 DEBUG oslo_concurrency.lockutils [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:40:11 compute-0 nova_compute[181100]: 2026-01-12 13:40:11.259 181104 DEBUG oslo_concurrency.lockutils [None req-8c6e3827-384c-47f8-b723-7af6ce662702 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:40:11 compute-0 virtqemud[153584]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Jan 12 13:40:11 compute-0 virtqemud[153584]: hostname: compute-0
Jan 12 13:40:11 compute-0 virtqemud[153584]: End of file while reading data: Input/output error
Jan 12 13:40:11 compute-0 systemd[1]: libpod-e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc.scope: Deactivated successfully.
Jan 12 13:40:11 compute-0 systemd[1]: libpod-e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc.scope: Consumed 2.344s CPU time.
Jan 12 13:40:11 compute-0 podman[181930]: 2026-01-12 13:40:11.512484919 +0000 UTC m=+0.279626332 container died e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:40:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc-userdata-shm.mount: Deactivated successfully.
Jan 12 13:40:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92-merged.mount: Deactivated successfully.
Jan 12 13:40:11 compute-0 podman[181930]: 2026-01-12 13:40:11.544263719 +0000 UTC m=+0.311405132 container cleanup e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Jan 12 13:40:11 compute-0 podman[181930]: nova_compute
Jan 12 13:40:11 compute-0 podman[181955]: nova_compute
Jan 12 13:40:11 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 12 13:40:11 compute-0 systemd[1]: Stopped nova_compute container.
Jan 12 13:40:11 compute-0 systemd[1]: Starting nova_compute container...
Jan 12 13:40:11 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:40:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/625856f06a0387e95b89f8380e9c08f27fbfa8edcb79a25b3f310ca7ed977c92/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:11 compute-0 podman[181965]: 2026-01-12 13:40:11.67594756 +0000 UTC m=+0.072459608 container init e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:40:11 compute-0 podman[181965]: 2026-01-12 13:40:11.681911183 +0000 UTC m=+0.078423210 container start e95d6543d47b30e55e5d1cbd3467be53071bea0a29d9f8c6f54bcc663c5215dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 12 13:40:11 compute-0 podman[181965]: nova_compute
Jan 12 13:40:11 compute-0 nova_compute[181978]: + sudo -E kolla_set_configs
Jan 12 13:40:11 compute-0 systemd[1]: Started nova_compute container.
Jan 12 13:40:11 compute-0 podman[181981]: 2026-01-12 13:40:11.708416423 +0000 UTC m=+0.041429370 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:40:11 compute-0 sudo[181916]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Validating config file
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying service configuration files
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /etc/ceph
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Creating directory /etc/ceph
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /etc/ceph
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Writing out command to execute
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:11 compute-0 nova_compute[181978]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 12 13:40:11 compute-0 nova_compute[181978]: ++ cat /run_command
Jan 12 13:40:11 compute-0 nova_compute[181978]: + CMD=nova-compute
Jan 12 13:40:11 compute-0 nova_compute[181978]: + ARGS=
Jan 12 13:40:11 compute-0 nova_compute[181978]: + sudo kolla_copy_cacerts
Jan 12 13:40:11 compute-0 nova_compute[181978]: + [[ ! -n '' ]]
Jan 12 13:40:11 compute-0 nova_compute[181978]: + . kolla_extend_start
Jan 12 13:40:11 compute-0 nova_compute[181978]: Running command: 'nova-compute'
Jan 12 13:40:11 compute-0 nova_compute[181978]: + echo 'Running command: '\''nova-compute'\'''
Jan 12 13:40:11 compute-0 nova_compute[181978]: + umask 0022
Jan 12 13:40:11 compute-0 nova_compute[181978]: + exec nova-compute
Jan 12 13:40:12 compute-0 sudo[182155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gchomdjxpdaffmuzmnlmaaeyvjxribgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225211.8483098-1287-236390526538609/AnsiballZ_podman_container.py'
Jan 12 13:40:12 compute-0 sudo[182155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:12 compute-0 python3.9[182157]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 12 13:40:12 compute-0 systemd[1]: Started libpod-conmon-951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134.scope.
Jan 12 13:40:12 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:40:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a3590e37b47937f160b8e6ba79ee62dc2f477eb749047ed6d2d7c2cb07785d2/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a3590e37b47937f160b8e6ba79ee62dc2f477eb749047ed6d2d7c2cb07785d2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a3590e37b47937f160b8e6ba79ee62dc2f477eb749047ed6d2d7c2cb07785d2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:12 compute-0 podman[182177]: 2026-01-12 13:40:12.370693161 +0000 UTC m=+0.073915004 container init 951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:40:12 compute-0 podman[182177]: 2026-01-12 13:40:12.375711209 +0000 UTC m=+0.078933021 container start 951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init)
Jan 12 13:40:12 compute-0 python3.9[182157]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Applying nova statedir ownership
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 12 13:40:12 compute-0 nova_compute_init[182197]: INFO:nova_statedir:Nova statedir ownership complete
Jan 12 13:40:12 compute-0 systemd[1]: libpod-951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134.scope: Deactivated successfully.
Jan 12 13:40:12 compute-0 podman[182207]: 2026-01-12 13:40:12.445838996 +0000 UTC m=+0.019391649 container died 951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 12 13:40:12 compute-0 podman[182207]: 2026-01-12 13:40:12.461506227 +0000 UTC m=+0.035058870 container cleanup 951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202)
Jan 12 13:40:12 compute-0 systemd[1]: libpod-conmon-951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134.scope: Deactivated successfully.
Jan 12 13:40:12 compute-0 sudo[182155]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:12 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 12 13:40:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-951311c2e62a7f7883fd58074e02994a393f93afc4692fd03066c8d1a1db5134-userdata-shm.mount: Deactivated successfully.
Jan 12 13:40:12 compute-0 sshd-session[159104]: Connection closed by 192.168.122.30 port 43470
Jan 12 13:40:12 compute-0 sshd-session[159101]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:40:12 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 12 13:40:12 compute-0 systemd[1]: session-23.scope: Consumed 1min 5.436s CPU time.
Jan 12 13:40:12 compute-0 systemd-logind[775]: Session 23 logged out. Waiting for processes to exit.
Jan 12 13:40:12 compute-0 systemd-logind[775]: Removed session 23.
Jan 12 13:40:13 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.358 181991 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.358 181991 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.358 181991 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.358 181991 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.462 181991 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.471 181991 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.472 181991 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.845 181991 INFO nova.virt.driver [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.923 181991 INFO nova.compute.provider_config [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.934 181991 DEBUG oslo_concurrency.lockutils [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.934 181991 DEBUG oslo_concurrency.lockutils [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.934 181991 DEBUG oslo_concurrency.lockutils [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.934 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.935 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.935 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.935 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.935 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.935 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.935 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.935 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.936 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.936 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.936 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.936 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.936 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.936 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.936 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.937 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.937 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.937 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.937 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.937 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.937 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.937 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.938 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.938 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.938 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.938 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.938 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.938 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.938 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.939 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.939 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.939 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.939 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.939 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.939 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.939 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.940 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.940 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.940 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.940 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.940 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.940 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.941 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.941 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.941 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.941 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.941 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.941 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.942 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.942 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.942 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.942 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.942 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.942 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.942 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.943 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.943 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.943 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.943 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.943 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.943 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.943 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.944 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.945 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.945 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.945 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.945 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.945 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.945 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.946 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.947 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.947 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.947 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.947 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.947 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.947 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.947 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.948 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.948 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.948 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.948 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.948 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.948 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.948 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.949 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.949 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.949 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.949 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.949 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.949 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.949 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.950 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.951 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.951 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.951 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.951 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.951 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.951 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.951 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.952 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.952 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.952 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.952 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.952 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.952 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.952 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.953 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.954 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.954 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.954 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.954 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.954 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.954 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.954 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.955 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.955 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.955 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.955 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.955 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.955 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.955 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.956 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.956 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.956 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.956 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.956 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.956 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.956 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.957 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.957 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.957 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.957 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.957 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.957 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.957 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.958 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.958 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.958 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.958 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.958 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.958 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.958 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.959 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.960 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.960 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.960 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.960 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.960 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.960 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.960 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.961 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.961 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.961 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.961 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.961 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.961 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.961 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.962 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.962 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.962 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.962 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.962 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.962 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.962 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.963 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.963 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.963 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.963 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.963 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.963 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.963 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.964 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.965 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.965 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.965 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.965 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.965 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.965 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.965 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.966 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.966 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.966 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.966 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.966 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.966 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.966 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.967 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.967 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.967 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.967 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.967 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.967 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.967 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.968 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.968 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.968 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.968 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.968 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.968 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.968 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.969 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.969 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.969 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.969 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.969 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.969 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.969 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.970 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.971 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.971 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.971 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.971 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.971 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.971 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.971 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.972 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.972 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.972 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.972 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.972 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.972 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.972 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.973 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.973 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.973 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.973 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.973 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.973 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.973 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.974 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.974 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.974 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.974 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.974 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.974 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.974 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.975 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.975 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.975 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.975 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.975 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.975 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.975 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.976 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.977 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.977 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.977 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.977 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.977 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.977 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.977 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.978 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.978 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.978 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.978 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.978 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.978 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.978 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.979 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.979 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.979 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.979 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.979 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.979 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.979 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.980 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.981 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.981 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.981 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.981 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.981 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.981 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.981 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.982 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.982 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.982 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.982 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.982 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.982 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.982 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.983 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.983 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.983 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.983 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.983 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.983 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.983 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.984 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.984 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.984 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.984 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.984 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.984 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.985 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.985 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.985 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.985 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.985 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.985 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.985 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.986 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.986 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.986 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.986 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.986 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.986 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.986 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.987 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.987 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.987 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.987 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.987 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.987 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.987 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.988 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.988 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.988 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.988 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.988 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.988 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.988 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.989 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.989 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.989 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.989 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.989 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.989 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.989 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.990 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.990 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.990 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.990 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.990 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.990 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.990 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.991 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.991 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.991 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.991 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.991 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.991 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.991 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.992 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.993 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.993 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.993 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.993 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.993 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.993 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.993 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.994 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.995 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.995 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.995 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.995 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.995 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.995 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.995 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.996 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.996 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.996 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.996 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.996 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.996 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.996 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.997 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.998 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.998 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.998 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.998 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.998 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.998 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:13 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:13.999 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.000 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.000 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.000 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.000 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.000 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.000 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.000 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.001 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.001 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.001 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.001 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.001 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.001 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.001 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.002 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.002 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.002 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.002 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.002 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.002 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.002 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.003 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.003 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.003 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.003 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.003 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.003 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.003 181991 WARNING oslo_config.cfg [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 12 13:40:14 compute-0 nova_compute[181978]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 12 13:40:14 compute-0 nova_compute[181978]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 12 13:40:14 compute-0 nova_compute[181978]: and ``live_migration_inbound_addr`` respectively.
Jan 12 13:40:14 compute-0 nova_compute[181978]: ).  Its value may be silently ignored in the future.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.004 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.004 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.004 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.004 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.004 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.004 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.005 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.005 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.005 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.005 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.005 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.005 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.005 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.006 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.006 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.006 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.006 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.006 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.006 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.007 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.007 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.007 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.007 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.007 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.007 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.007 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.008 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.008 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.008 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.008 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.008 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.008 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.008 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.009 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.009 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.009 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.009 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.009 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.009 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.009 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.010 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.010 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.010 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.010 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.010 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.010 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.010 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.011 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.011 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.011 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.011 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.011 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.011 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.011 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.012 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.012 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.012 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.012 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.012 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.012 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.012 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.013 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.013 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.013 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.013 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.013 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.013 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.013 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.014 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.015 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.015 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.015 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.015 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.015 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.015 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.015 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.016 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.016 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.016 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.016 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.016 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.016 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.016 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.017 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.017 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.017 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.017 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.017 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.017 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.017 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.018 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.018 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.018 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.018 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.018 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.018 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.018 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.019 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.020 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.020 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.020 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.020 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.020 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.020 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.020 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.021 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.021 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.021 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.021 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.021 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.021 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.021 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.022 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.022 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.022 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.022 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.022 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.022 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.022 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.023 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.023 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.023 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.023 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.023 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.023 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.023 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.024 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.024 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.024 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.024 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.024 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.024 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.025 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.025 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.025 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.025 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.025 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.025 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.025 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.026 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.026 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.026 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.026 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.026 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.026 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.026 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.027 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.027 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.027 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.027 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.027 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.027 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.027 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.028 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.029 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.029 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.029 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.029 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.029 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.029 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.030 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.030 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.030 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.030 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.030 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.030 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.030 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.031 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.032 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.032 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.032 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.032 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.032 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.032 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.033 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.033 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.033 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.033 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.033 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.033 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.033 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.034 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.035 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.035 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.035 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.035 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.035 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.035 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.035 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.036 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.036 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.036 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.036 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.036 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.036 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.036 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.037 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.038 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.038 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.038 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.038 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.038 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.038 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.038 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.039 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.039 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.039 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.039 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.039 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.039 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.040 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.040 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.040 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.040 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.040 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.040 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.041 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.042 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.042 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.042 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.042 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.042 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.042 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.042 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.043 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.043 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.043 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.043 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.043 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.043 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.043 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.044 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.044 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.044 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.044 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.044 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.044 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.044 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.045 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.045 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.045 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.045 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.045 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.045 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.045 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.046 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.046 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.046 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.046 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.046 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.046 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.046 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.047 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.047 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.047 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.047 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.047 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.047 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.047 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.048 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.048 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.048 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.048 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.048 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.048 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.048 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.049 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.049 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.049 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.049 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.049 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.049 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.049 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.050 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.050 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.050 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.050 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.050 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.050 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.050 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.051 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.051 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.051 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.051 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.051 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.051 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.051 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.052 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.052 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.052 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.052 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.052 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.052 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.052 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.053 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.054 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.054 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.054 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.054 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.054 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.054 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.054 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.055 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.056 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.056 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.056 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.056 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.056 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.056 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.056 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.057 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.057 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.057 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.057 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.057 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.057 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.057 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.058 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.058 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.058 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.058 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.058 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.058 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.058 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.059 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.059 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.059 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.059 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.059 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.059 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.059 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.060 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.060 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.060 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.060 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.060 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.060 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.060 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.061 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.061 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.061 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.061 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.061 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.061 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.061 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.062 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.063 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.063 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.063 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.063 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.063 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.063 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.063 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.064 181991 DEBUG oslo_service.service [None req-9c5c8650-8390-4341-a372-66831106f5eb - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.064 181991 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.075 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.076 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.076 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.076 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.085 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc2dc24c4c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.087 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc2dc24c4c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.087 181991 INFO nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Connection event '1' reason 'None'
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.091 181991 INFO nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Libvirt host capabilities <capabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]: 
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <host>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <uuid>d52817b9-7ba9-47d6-a0e8-c5b94b158f91</uuid>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <arch>x86_64</arch>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model>EPYC-Milan-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <vendor>AMD</vendor>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <microcode version='167776725'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <signature family='25' model='1' stepping='1'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <topology sockets='4' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <maxphysaddr mode='emulate' bits='48'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='x2apic'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='tsc-deadline'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='osxsave'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='hypervisor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='tsc_adjust'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='ospke'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='vaes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='vpclmulqdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='spec-ctrl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='stibp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='arch-capabilities'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='cmp_legacy'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='virt-ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='lbrv'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='tsc-scale'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='vmcb-clean'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='pause-filter'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='pfthreshold'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='v-vmsave-vmload'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='vgif'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='rdctl-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='skip-l1dfl-vmentry'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='mds-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature name='pschange-mc-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <pages unit='KiB' size='4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <pages unit='KiB' size='2048'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <pages unit='KiB' size='1048576'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <power_management>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <suspend_mem/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <suspend_disk/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <suspend_hybrid/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </power_management>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <iommu support='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <migration_features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <live/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <uri_transports>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <uri_transport>tcp</uri_transport>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <uri_transport>rdma</uri_transport>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </uri_transports>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </migration_features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <topology>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <cells num='1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <cell id='0'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           <memory unit='KiB'>7865368</memory>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           <pages unit='KiB' size='4'>1966342</pages>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           <pages unit='KiB' size='2048'>0</pages>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           <distances>
Jan 12 13:40:14 compute-0 nova_compute[181978]:             <sibling id='0' value='10'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           </distances>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           <cpus num='4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:           </cpus>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         </cell>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </cells>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </topology>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <cache>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </cache>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <secmodel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model>selinux</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <doi>0</doi>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </secmodel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <secmodel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model>dac</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <doi>0</doi>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </secmodel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </host>
Jan 12 13:40:14 compute-0 nova_compute[181978]: 
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <guest>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <os_type>hvm</os_type>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <arch name='i686'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <wordsize>32</wordsize>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <domain type='qemu'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <domain type='kvm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </arch>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <pae/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <nonpae/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <acpi default='on' toggle='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <apic default='on' toggle='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <cpuselection/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <deviceboot/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <disksnapshot default='on' toggle='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <externalSnapshot/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </guest>
Jan 12 13:40:14 compute-0 nova_compute[181978]: 
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <guest>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <os_type>hvm</os_type>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <arch name='x86_64'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <wordsize>64</wordsize>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <domain type='qemu'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <domain type='kvm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </arch>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <acpi default='on' toggle='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <apic default='on' toggle='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <cpuselection/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <deviceboot/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <disksnapshot default='on' toggle='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <externalSnapshot/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </guest>
Jan 12 13:40:14 compute-0 nova_compute[181978]: 
Jan 12 13:40:14 compute-0 nova_compute[181978]: </capabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]: 
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.098 181991 WARNING nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.098 181991 DEBUG nova.virt.libvirt.volume.mount [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.103 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.119 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 12 13:40:14 compute-0 nova_compute[181978]: <domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <path>/usr/libexec/qemu-kvm</path>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <domain>kvm</domain>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <arch>i686</arch>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <vcpu max='4096'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <iothreads supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <os supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='firmware'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <loader supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>rom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pflash</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='readonly'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>yes</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='secure'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </loader>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </os>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-passthrough' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='hostPassthroughMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='maximum' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='maximumMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-model' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <vendor>AMD</vendor>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='x2apic'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-deadline'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='hypervisor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc_adjust'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vaes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='spec-ctrl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='stibp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='cmp_legacy'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='overflow-recov'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='succor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='virt-ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lbrv'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-scale'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vmcb-clean'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='flushbyasid'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pause-filter'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pfthreshold'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vgif'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='custom' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Milan-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-128'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-256'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-512'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v6'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v7'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <memoryBacking supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='sourceType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>anonymous</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>memfd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </memoryBacking>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <disk supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='diskDevice'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>disk</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cdrom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>floppy</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>lun</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>fdc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>sata</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <graphics supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vnc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egl-headless</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <video supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='modelType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vga</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cirrus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>none</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>bochs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ramfb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </video>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hostdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='mode'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>subsystem</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='startupPolicy'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>mandatory</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>requisite</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>optional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='subsysType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pci</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='capsType'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='pciBackend'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hostdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <rng supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>random</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <filesystem supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='driverType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>path</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>handle</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtiofs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </filesystem>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <tpm supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-tis</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-crb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emulator</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>external</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendVersion'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>2.0</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </tpm>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <redirdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </redirdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <channel supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </channel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <crypto supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </crypto>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <interface supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>passt</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <panic supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>isa</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>hyperv</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </panic>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <console supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>null</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dev</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pipe</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stdio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>udp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tcp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu-vdagent</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </console>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <gic supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <vmcoreinfo supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <genid supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backingStoreInput supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backup supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <async-teardown supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <ps2 supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sev supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sgx supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hyperv supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='features'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>relaxed</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vapic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>spinlocks</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vpindex</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>runtime</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>synic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stimer</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reset</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vendor_id</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>frequencies</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reenlightenment</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tlbflush</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ipi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>avic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emsr_bitmap</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>xmm_input</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <spinlocks>4095</spinlocks>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <stimer_direct>on</stimer_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_direct>on</tlbflush_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_extended>on</tlbflush_extended>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hyperv>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <launchSecurity supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='sectype'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tdx</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </launchSecurity>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </features>
Jan 12 13:40:14 compute-0 nova_compute[181978]: </domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.122 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 12 13:40:14 compute-0 nova_compute[181978]: <domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <path>/usr/libexec/qemu-kvm</path>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <domain>kvm</domain>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <arch>i686</arch>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <vcpu max='240'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <iothreads supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <os supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='firmware'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <loader supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>rom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pflash</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='readonly'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>yes</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='secure'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </loader>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </os>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-passthrough' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='hostPassthroughMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='maximum' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='maximumMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-model' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <vendor>AMD</vendor>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='x2apic'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-deadline'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='hypervisor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc_adjust'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vaes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='spec-ctrl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='stibp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='cmp_legacy'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='overflow-recov'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='succor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='virt-ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lbrv'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-scale'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vmcb-clean'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='flushbyasid'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pause-filter'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pfthreshold'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vgif'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='custom' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Milan-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-128'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-256'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-512'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v6'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v7'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <memoryBacking supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='sourceType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>anonymous</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>memfd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </memoryBacking>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <disk supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='diskDevice'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>disk</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cdrom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>floppy</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>lun</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ide</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>fdc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>sata</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <graphics supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vnc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egl-headless</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <video supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='modelType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vga</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cirrus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>none</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>bochs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ramfb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </video>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hostdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='mode'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>subsystem</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='startupPolicy'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>mandatory</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>requisite</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>optional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='subsysType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pci</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='capsType'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='pciBackend'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hostdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <rng supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>random</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <filesystem supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='driverType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>path</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>handle</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtiofs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </filesystem>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <tpm supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-tis</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-crb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emulator</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>external</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendVersion'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>2.0</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </tpm>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <redirdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </redirdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <channel supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </channel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <crypto supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </crypto>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <interface supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>passt</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <panic supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>isa</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>hyperv</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </panic>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <console supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>null</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dev</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pipe</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stdio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>udp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tcp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu-vdagent</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </console>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <gic supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <vmcoreinfo supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <genid supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backingStoreInput supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backup supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <async-teardown supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <ps2 supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sev supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sgx supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hyperv supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='features'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>relaxed</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vapic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>spinlocks</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vpindex</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>runtime</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>synic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stimer</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reset</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vendor_id</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>frequencies</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reenlightenment</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tlbflush</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ipi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>avic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emsr_bitmap</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>xmm_input</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <spinlocks>4095</spinlocks>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <stimer_direct>on</stimer_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_direct>on</tlbflush_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_extended>on</tlbflush_extended>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hyperv>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <launchSecurity supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='sectype'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tdx</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </launchSecurity>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </features>
Jan 12 13:40:14 compute-0 nova_compute[181978]: </domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.123 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.126 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 12 13:40:14 compute-0 nova_compute[181978]: <domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <path>/usr/libexec/qemu-kvm</path>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <domain>kvm</domain>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <arch>x86_64</arch>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <vcpu max='4096'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <iothreads supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <os supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='firmware'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>efi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <loader supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>rom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pflash</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='readonly'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>yes</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='secure'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>yes</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </loader>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </os>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-passthrough' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='hostPassthroughMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='maximum' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='maximumMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-model' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <vendor>AMD</vendor>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='x2apic'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-deadline'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='hypervisor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc_adjust'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vaes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='spec-ctrl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='stibp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='cmp_legacy'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='overflow-recov'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='succor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='virt-ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lbrv'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-scale'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vmcb-clean'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='flushbyasid'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pause-filter'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pfthreshold'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vgif'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='custom' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Milan-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-128'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-256'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-512'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v6'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v7'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <memoryBacking supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='sourceType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>anonymous</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>memfd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </memoryBacking>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <disk supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='diskDevice'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>disk</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cdrom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>floppy</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>lun</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>fdc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>sata</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <graphics supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vnc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egl-headless</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <video supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='modelType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vga</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cirrus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>none</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>bochs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ramfb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </video>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hostdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='mode'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>subsystem</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='startupPolicy'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>mandatory</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>requisite</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>optional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='subsysType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pci</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='capsType'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='pciBackend'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hostdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <rng supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>random</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <filesystem supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='driverType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>path</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>handle</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtiofs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </filesystem>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <tpm supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-tis</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-crb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emulator</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>external</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendVersion'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>2.0</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </tpm>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <redirdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </redirdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <channel supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </channel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <crypto supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </crypto>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <interface supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>passt</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <panic supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>isa</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>hyperv</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </panic>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <console supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>null</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dev</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pipe</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stdio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>udp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tcp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu-vdagent</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </console>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <gic supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <vmcoreinfo supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <genid supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backingStoreInput supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backup supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <async-teardown supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <ps2 supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sev supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sgx supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hyperv supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='features'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>relaxed</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vapic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>spinlocks</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vpindex</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>runtime</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>synic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stimer</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reset</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vendor_id</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>frequencies</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reenlightenment</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tlbflush</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ipi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>avic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emsr_bitmap</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>xmm_input</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <spinlocks>4095</spinlocks>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <stimer_direct>on</stimer_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_direct>on</tlbflush_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_extended>on</tlbflush_extended>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hyperv>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <launchSecurity supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='sectype'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tdx</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </launchSecurity>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </features>
Jan 12 13:40:14 compute-0 nova_compute[181978]: </domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.173 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 12 13:40:14 compute-0 nova_compute[181978]: <domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <path>/usr/libexec/qemu-kvm</path>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <domain>kvm</domain>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <arch>x86_64</arch>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <vcpu max='240'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <iothreads supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <os supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='firmware'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <loader supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>rom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pflash</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='readonly'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>yes</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='secure'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>no</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </loader>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </os>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-passthrough' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='hostPassthroughMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='maximum' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='maximumMigratable'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>on</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>off</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='host-model' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <vendor>AMD</vendor>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <maxphysaddr mode='passthrough' limit='48'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='x2apic'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-deadline'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='hypervisor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc_adjust'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vaes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='spec-ctrl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='stibp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='cmp_legacy'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='overflow-recov'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='succor'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='virt-ssbd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lbrv'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='tsc-scale'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vmcb-clean'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='flushbyasid'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pause-filter'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='pfthreshold'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='v-vmsave-vmload'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='vgif'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <mode name='custom' supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Broadwell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cascadelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Cooperlake-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Denverton-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Genoa-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='auto-ibrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='EPYC-Milan-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amd-psfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='no-nested-data-bp'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='null-sel-clr-base'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='stibp-always-on'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='GraniteRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-128'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-256'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx10-512'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='prefetchiti'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Haswell-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-noTSX'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v6'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Icelake-Server-v7'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='KnightsMill-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4fmaps'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-4vnniw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512er'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512pf'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G4-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Opteron_G5-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fma4'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tbm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xop'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SapphireRapids-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='amx-tile'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-bf16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-fp16'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512-vpopcntdq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bitalg'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vbmi2'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrc'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fzrm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='la57'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='taa-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='tsx-ldtrk'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='xfd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='SierraForest-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ifma'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-ne-convert'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx-vnni-int8'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='bus-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cmpccxadd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fbsdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='fsrs'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ibrs-all'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mcdt-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='pbrsb-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='psdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='sbdr-ssdp-no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='serialize'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Client-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='hle'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='rtm'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Skylake-Server-v5'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512bw'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512cd'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512dq'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512f'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='avx512vl'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='mpx'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v2'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v3'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='core-capability'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='split-lock-detect'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='Snowridge-v4'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='cldemote'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='gfni'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdir64b'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='movdiri'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='athlon-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='core2duo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='coreduo-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='n270-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='ss'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <blockers model='phenom-v1'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnow'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <feature name='3dnowext'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </blockers>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </mode>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <memoryBacking supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <enum name='sourceType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>anonymous</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <value>memfd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </memoryBacking>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <disk supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='diskDevice'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>disk</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cdrom</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>floppy</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>lun</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ide</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>fdc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>sata</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <graphics supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vnc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egl-headless</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <video supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='modelType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vga</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>cirrus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>none</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>bochs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ramfb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </video>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hostdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='mode'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>subsystem</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='startupPolicy'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>mandatory</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>requisite</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>optional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='subsysType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pci</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>scsi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='capsType'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='pciBackend'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hostdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <rng supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtio-non-transitional</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>random</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>egd</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <filesystem supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='driverType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>path</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>handle</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>virtiofs</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </filesystem>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <tpm supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-tis</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tpm-crb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emulator</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>external</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendVersion'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>2.0</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </tpm>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <redirdev supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='bus'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>usb</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </redirdev>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <channel supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </channel>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <crypto supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendModel'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>builtin</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </crypto>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <interface supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='backendType'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>default</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>passt</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <panic supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='model'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>isa</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>hyperv</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </panic>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <console supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='type'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>null</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vc</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pty</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dev</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>file</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>pipe</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stdio</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>udp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tcp</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>unix</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>qemu-vdagent</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>dbus</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </console>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   <features>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <gic supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <vmcoreinfo supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <genid supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backingStoreInput supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <backup supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <async-teardown supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <ps2 supported='yes'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sev supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <sgx supported='no'/>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <hyperv supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='features'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>relaxed</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vapic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>spinlocks</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vpindex</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>runtime</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>synic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>stimer</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reset</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>vendor_id</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>frequencies</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>reenlightenment</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tlbflush</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>ipi</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>avic</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>emsr_bitmap</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>xmm_input</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <spinlocks>4095</spinlocks>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <stimer_direct>on</stimer_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_direct>on</tlbflush_direct>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <tlbflush_extended>on</tlbflush_extended>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </defaults>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </hyperv>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     <launchSecurity supported='yes'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       <enum name='sectype'>
Jan 12 13:40:14 compute-0 nova_compute[181978]:         <value>tdx</value>
Jan 12 13:40:14 compute-0 nova_compute[181978]:       </enum>
Jan 12 13:40:14 compute-0 nova_compute[181978]:     </launchSecurity>
Jan 12 13:40:14 compute-0 nova_compute[181978]:   </features>
Jan 12 13:40:14 compute-0 nova_compute[181978]: </domainCapabilities>
Jan 12 13:40:14 compute-0 nova_compute[181978]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.216 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.216 181991 INFO nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Secure Boot support detected
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.217 181991 INFO nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.217 181991 INFO nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.223 181991 DEBUG nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.254 181991 INFO nova.virt.node [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Determined node identity 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 from /var/lib/nova/compute_id
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.268 181991 WARNING nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Compute nodes ['5f3fe3a8-f640-4221-8f9a-71aa07eebe17'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.310 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.506 181991 WARNING nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.507 181991 DEBUG oslo_concurrency.lockutils [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.507 181991 DEBUG oslo_concurrency.lockutils [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.507 181991 DEBUG oslo_concurrency.lockutils [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.507 181991 DEBUG nova.compute.resource_tracker [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:40:14 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 12 13:40:14 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.835 181991 WARNING nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.836 181991 DEBUG nova.compute.resource_tracker [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6129MB free_disk=73.5858039855957GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": 
"0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.837 181991 DEBUG oslo_concurrency.lockutils [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.837 181991 DEBUG oslo_concurrency.lockutils [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.848 181991 WARNING nova.compute.resource_tracker [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] No compute node record for compute-0.ctlplane.example.com:5f3fe3a8-f640-4221-8f9a-71aa07eebe17: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 could not be found.
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.860 181991 INFO nova.compute.resource_tracker [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.897 181991 DEBUG nova.compute.resource_tracker [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:40:14 compute-0 nova_compute[181978]: 2026-01-12 13:40:14.898 181991 DEBUG nova.compute.resource_tracker [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:40:14 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 12 13:40:15 compute-0 nova_compute[181978]: 2026-01-12 13:40:15.623 181991 INFO nova.scheduler.client.report [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [req-b29080d7-610c-42e4-a81a-b16529491f57] Created resource provider record via placement API for resource provider with UUID 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 and name compute-0.ctlplane.example.com.
Jan 12 13:40:15 compute-0 nova_compute[181978]: 2026-01-12 13:40:15.990 181991 DEBUG nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 12 13:40:15 compute-0 nova_compute[181978]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 12 13:40:15 compute-0 nova_compute[181978]: 2026-01-12 13:40:15.990 181991 INFO nova.virt.libvirt.host [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] kernel doesn't support AMD SEV
Jan 12 13:40:15 compute-0 nova_compute[181978]: 2026-01-12 13:40:15.990 181991 DEBUG nova.compute.provider_tree [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:40:15 compute-0 nova_compute[181978]: 2026-01-12 13:40:15.991 181991 DEBUG nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.030 181991 DEBUG nova.scheduler.client.report [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Updated inventory for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.030 181991 DEBUG nova.compute.provider_tree [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Updating resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.031 181991 DEBUG nova.compute.provider_tree [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.120 181991 DEBUG nova.compute.provider_tree [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Updating resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.146 181991 DEBUG nova.compute.resource_tracker [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.146 181991 DEBUG oslo_concurrency.lockutils [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.146 181991 DEBUG nova.service [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.187 181991 DEBUG nova.service [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 12 13:40:16 compute-0 nova_compute[181978]: 2026-01-12 13:40:16.187 181991 DEBUG nova.servicegroup.drivers.db [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 12 13:40:17 compute-0 sshd-session[182307]: Accepted publickey for zuul from 192.168.122.30 port 34696 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:40:17 compute-0 systemd-logind[775]: New session 25 of user zuul.
Jan 12 13:40:17 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 12 13:40:17 compute-0 sshd-session[182307]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:40:18 compute-0 python3.9[182460]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 12 13:40:19 compute-0 sudo[182614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjywjaukshlumwrhqsangbuppbcnexmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225219.1534576-31-252013082158537/AnsiballZ_systemd_service.py'
Jan 12 13:40:19 compute-0 sudo[182614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:19 compute-0 python3.9[182616]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:40:19 compute-0 systemd[1]: Reloading.
Jan 12 13:40:19 compute-0 systemd-sysv-generator[182639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:40:19 compute-0 systemd-rc-local-generator[182636]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:40:20 compute-0 sudo[182614]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:20 compute-0 python3.9[182800]: ansible-ansible.builtin.service_facts Invoked
Jan 12 13:40:20 compute-0 network[182817]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 12 13:40:20 compute-0 network[182818]: 'network-scripts' will be removed from distribution in near future.
Jan 12 13:40:20 compute-0 network[182819]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 12 13:40:22 compute-0 sudo[183089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbjznqbvanefqgmgunblkxbovbhujngi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225222.6747851-50-153357436509058/AnsiballZ_systemd_service.py'
Jan 12 13:40:22 compute-0 sudo[183089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:23 compute-0 python3.9[183091]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:40:23 compute-0 sudo[183089]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:23 compute-0 sudo[183242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgqpmkuxazurpxaflxxjxwicnmkgxucm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225223.4824345-60-266687244076251/AnsiballZ_file.py'
Jan 12 13:40:23 compute-0 sudo[183242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:23 compute-0 python3.9[183244]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:23 compute-0 sudo[183242]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:23 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 12 13:40:24 compute-0 sudo[183395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdosemdxdxhwjdkzwkxiqsqilytkmwoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225224.088331-68-231681851572532/AnsiballZ_file.py'
Jan 12 13:40:24 compute-0 sudo[183395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:24 compute-0 python3.9[183397]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:24 compute-0 sudo[183395]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:24 compute-0 sudo[183547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqfnpcwthjumxdfcllmznpltsnhdehya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225224.5685344-77-227855151386156/AnsiballZ_command.py'
Jan 12 13:40:24 compute-0 sudo[183547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:24 compute-0 python3.9[183549]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:40:25 compute-0 sudo[183547]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:25 compute-0 python3.9[183701]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 12 13:40:25 compute-0 sudo[183851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neoippqgmuvxdwizivlzpsyplvnlmdga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225225.7126071-95-130486781228805/AnsiballZ_systemd_service.py'
Jan 12 13:40:25 compute-0 sudo[183851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:26 compute-0 python3.9[183853]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:40:26 compute-0 systemd[1]: Reloading.
Jan 12 13:40:26 compute-0 systemd-sysv-generator[183878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:40:26 compute-0 systemd-rc-local-generator[183875]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:40:26 compute-0 sudo[183851]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:26 compute-0 sudo[184039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lynikrbricfnkcnxbdswrpddmctxtmdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225226.4686522-103-16735751941299/AnsiballZ_command.py'
Jan 12 13:40:26 compute-0 sudo[184039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:26 compute-0 python3.9[184041]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:40:26 compute-0 sudo[184039]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:27 compute-0 sudo[184192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iktuykrtxqoshgkuaqarwanzzkzaxcdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225226.9822228-112-264605377076152/AnsiballZ_file.py'
Jan 12 13:40:27 compute-0 sudo[184192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:27 compute-0 python3.9[184194]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:27 compute-0 sudo[184192]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:27 compute-0 python3.9[184344]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:28 compute-0 sudo[184496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imtcucxezjafjtwrcsgejimppwrjpnqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225228.0823557-128-110827745625490/AnsiballZ_group.py'
Jan 12 13:40:28 compute-0 sudo[184496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:28 compute-0 python3.9[184498]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 12 13:40:28 compute-0 sudo[184496]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:29 compute-0 sudo[184648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhrksvvtppkbjvfhdvdantlmzmrtjoeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225228.7808077-139-61832522889702/AnsiballZ_getent.py'
Jan 12 13:40:29 compute-0 sudo[184648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:29 compute-0 python3.9[184650]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 12 13:40:29 compute-0 sudo[184648]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:29 compute-0 sudo[184801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjssptoxgfvpualnrtbzptjcoyleyeev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225229.3380573-147-184547806975298/AnsiballZ_group.py'
Jan 12 13:40:29 compute-0 sudo[184801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:29 compute-0 python3.9[184803]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 12 13:40:29 compute-0 groupadd[184804]: group added to /etc/group: name=ceilometer, GID=42405
Jan 12 13:40:29 compute-0 groupadd[184804]: group added to /etc/gshadow: name=ceilometer
Jan 12 13:40:29 compute-0 groupadd[184804]: new group: name=ceilometer, GID=42405
Jan 12 13:40:29 compute-0 sudo[184801]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:30 compute-0 sudo[184959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmsxxidwfbwcxtrujcgwkzxvecowpvxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225229.8167386-155-262315404998469/AnsiballZ_user.py'
Jan 12 13:40:30 compute-0 sudo[184959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:30 compute-0 python3.9[184961]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 12 13:40:30 compute-0 useradd[184963]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 12 13:40:30 compute-0 useradd[184963]: add 'ceilometer' to group 'libvirt'
Jan 12 13:40:30 compute-0 useradd[184963]: add 'ceilometer' to shadow group 'libvirt'
Jan 12 13:40:30 compute-0 sudo[184959]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:31 compute-0 python3.9[185119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:31 compute-0 python3.9[185240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1768225230.9380496-181-200807296589577/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:32 compute-0 python3.9[185390]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:32 compute-0 python3.9[185511]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1768225231.865467-181-270894336998843/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:32 compute-0 python3.9[185661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:33 compute-0 python3.9[185782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1768225232.6505668-181-37160185799136/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:33 compute-0 podman[185783]: 2026-01-12 13:40:33.394396399 +0000 UTC m=+0.060379134 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 12 13:40:33 compute-0 python3.9[185956]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:34 compute-0 python3.9[186108]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:34 compute-0 python3.9[186260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:35 compute-0 python3.9[186381]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225234.3514087-240-5771873930472/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:35 compute-0 python3.9[186531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:35 compute-0 python3.9[186652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225235.13265-240-60550597251487/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:36 compute-0 python3.9[186802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:36 compute-0 python3.9[186923]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225235.961848-269-218448110247379/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:37 compute-0 python3.9[187073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:37 compute-0 python3.9[187194]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225236.8358495-285-125321642715134/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:38 compute-0 python3.9[187344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:38 compute-0 python3.9[187465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225237.7677908-300-255719723668851/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:38 compute-0 python3.9[187615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:39 compute-0 python3.9[187736]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225238.573883-315-94420776123390/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:39 compute-0 sudo[187886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urodztgiruwtouojcrezzgjefucsrwbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225239.3608482-330-160622032734055/AnsiballZ_file.py'
Jan 12 13:40:39 compute-0 sudo[187886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:39 compute-0 python3.9[187888]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:39 compute-0 sudo[187886]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:39 compute-0 sudo[188038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwqgmoxipuylrnlhbbfjseamucqsuall ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225239.8172266-338-76045899548305/AnsiballZ_file.py'
Jan 12 13:40:39 compute-0 sudo[188038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:40 compute-0 python3.9[188040]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:40 compute-0 sudo[188038]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:40:40.193 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:40:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:40:40.193 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:40:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:40:40.193 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:40:40 compute-0 python3.9[188190]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:41 compute-0 python3.9[188342]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:41 compute-0 python3.9[188494]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:41 compute-0 sudo[188659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poxvaoooofnzusdnbfnwhzwuipiwatzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225241.636481-370-156263690725734/AnsiballZ_file.py'
Jan 12 13:40:41 compute-0 sudo[188659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:41 compute-0 podman[188620]: 2026-01-12 13:40:41.995393689 +0000 UTC m=+0.067222988 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 12 13:40:42 compute-0 python3.9[188665]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:42 compute-0 sudo[188659]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:42 compute-0 sudo[188816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhdqnvjybrmezwykxasazjgrdgezjeso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225242.24528-378-204963810531258/AnsiballZ_systemd_service.py'
Jan 12 13:40:42 compute-0 sudo[188816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:42 compute-0 python3.9[188818]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:40:42 compute-0 systemd[1]: Reloading.
Jan 12 13:40:42 compute-0 systemd-sysv-generator[188844]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:40:42 compute-0 systemd-rc-local-generator[188841]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:40:42 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 12 13:40:42 compute-0 sudo[188816]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:43 compute-0 nova_compute[181978]: 2026-01-12 13:40:43.189 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:40:43 compute-0 nova_compute[181978]: 2026-01-12 13:40:43.203 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:40:43 compute-0 sudo[189007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcspecdjjjeyjqzmahrciqptdvktsygv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225243.1806371-387-173935403467292/AnsiballZ_stat.py'
Jan 12 13:40:43 compute-0 sudo[189007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:43 compute-0 python3.9[189009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:43 compute-0 sudo[189007]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:43 compute-0 sudo[189130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycilanrrtqzyrpbvrrsnwlwuzoizumjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225243.1806371-387-173935403467292/AnsiballZ_copy.py'
Jan 12 13:40:43 compute-0 sudo[189130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:43 compute-0 python3.9[189132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225243.1806371-387-173935403467292/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:43 compute-0 sudo[189130]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:44 compute-0 sudo[189206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xodarwavxkoijlofgfhwchmwgwrxokwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225243.1806371-387-173935403467292/AnsiballZ_stat.py'
Jan 12 13:40:44 compute-0 sudo[189206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:44 compute-0 python3.9[189208]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:44 compute-0 sudo[189206]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:44 compute-0 sudo[189329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcnfpqkwgqpwcepoecdjjlnlutgfgxfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225243.1806371-387-173935403467292/AnsiballZ_copy.py'
Jan 12 13:40:44 compute-0 sudo[189329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:44 compute-0 python3.9[189331]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225243.1806371-387-173935403467292/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:44 compute-0 sudo[189329]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:45 compute-0 sudo[189481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybttuliwmxlzrulidgzvhvekimlqqjrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225245.0408876-419-232639386164349/AnsiballZ_file.py'
Jan 12 13:40:45 compute-0 sudo[189481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:45 compute-0 python3.9[189483]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:45 compute-0 sudo[189481]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:45 compute-0 sudo[189633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrbqoadowwmmqozuxbblwkosqzbrmglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225245.5372624-427-54769939401712/AnsiballZ_file.py'
Jan 12 13:40:45 compute-0 sudo[189633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:45 compute-0 python3.9[189635]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:45 compute-0 sudo[189633]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:46 compute-0 sudo[189785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsqogezvrfjgnwgbiguydnbnwvutjxja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225246.0853903-435-4015442621127/AnsiballZ_stat.py'
Jan 12 13:40:46 compute-0 sudo[189785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:46 compute-0 python3.9[189787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:46 compute-0 sudo[189785]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:46 compute-0 sudo[189908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jshkufmzdveqmtxpuooxpwpozrdkzefj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225246.0853903-435-4015442621127/AnsiballZ_copy.py'
Jan 12 13:40:46 compute-0 sudo[189908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:46 compute-0 python3.9[189910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225246.0853903-435-4015442621127/.source.json _original_basename=.piz28f0y follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:46 compute-0 sudo[189908]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:47 compute-0 python3.9[190060]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:48 compute-0 sudo[190481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztiseccbodazljsxmpaujdmblcaytlbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225248.4972138-475-23406366965159/AnsiballZ_container_config_data.py'
Jan 12 13:40:48 compute-0 sudo[190481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:48 compute-0 python3.9[190483]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 12 13:40:48 compute-0 sudo[190481]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:49 compute-0 sudo[190633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unyjbiebqikndsxkawubvteoyotbzulp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225249.2268214-486-272304360465496/AnsiballZ_container_config_hash.py'
Jan 12 13:40:49 compute-0 sudo[190633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:49 compute-0 python3.9[190635]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:40:49 compute-0 sudo[190633]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:50 compute-0 sudo[190785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgfouxijbjuptzgzmqadkwwwdqfqlgws ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225249.9206884-496-149834146794976/AnsiballZ_edpm_container_manage.py'
Jan 12 13:40:50 compute-0 sudo[190785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:50 compute-0 python3[190787]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:40:50 compute-0 podman[190817]: 2026-01-12 13:40:50.594551337 +0000 UTC m=+0.028964051 container create 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 12 13:40:50 compute-0 podman[190817]: 2026-01-12 13:40:50.580797035 +0000 UTC m=+0.015209769 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf
Jan 12 13:40:50 compute-0 python3[190787]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf kolla_start
Jan 12 13:40:50 compute-0 sudo[190785]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:50 compute-0 sudo[190993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxyzynsngclvbujmfamywgeidmrbshs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225250.8027842-504-68326479706768/AnsiballZ_stat.py'
Jan 12 13:40:50 compute-0 sudo[190993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:51 compute-0 python3.9[190995]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:51 compute-0 sudo[190993]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:51 compute-0 sudo[191147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzkmvxgvxuuihjqtmzdhyideekwapfnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225251.3523815-513-98285451547668/AnsiballZ_file.py'
Jan 12 13:40:51 compute-0 sudo[191147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:51 compute-0 python3.9[191149]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:51 compute-0 sudo[191147]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:51 compute-0 sudo[191223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvvgkoeraibdhvtxznfabrbdoeogjwll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225251.3523815-513-98285451547668/AnsiballZ_stat.py'
Jan 12 13:40:51 compute-0 sudo[191223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:51 compute-0 python3.9[191225]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:40:52 compute-0 sudo[191223]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:52 compute-0 sudo[191374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpecrdwkzttzaytaiwxxxjfbubykhihz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225252.0473194-513-70556203794518/AnsiballZ_copy.py'
Jan 12 13:40:52 compute-0 sudo[191374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:52 compute-0 python3.9[191376]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768225252.0473194-513-70556203794518/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:52 compute-0 sudo[191374]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:52 compute-0 sudo[191450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kufzsfhdzzpixybprqjdgasqkqivnsom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225252.0473194-513-70556203794518/AnsiballZ_systemd.py'
Jan 12 13:40:52 compute-0 sudo[191450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:53 compute-0 python3.9[191452]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:40:53 compute-0 systemd[1]: Reloading.
Jan 12 13:40:53 compute-0 systemd-sysv-generator[191479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:40:53 compute-0 systemd-rc-local-generator[191475]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:40:53 compute-0 sudo[191450]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:53 compute-0 sudo[191561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjzhdsbaeboqkpfidwffnjnwifowqwov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225252.0473194-513-70556203794518/AnsiballZ_systemd.py'
Jan 12 13:40:53 compute-0 sudo[191561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:53 compute-0 python3.9[191563]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:40:53 compute-0 systemd[1]: Reloading.
Jan 12 13:40:53 compute-0 systemd-rc-local-generator[191587]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:40:53 compute-0 systemd-sysv-generator[191591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:40:54 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Jan 12 13:40:54 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:40:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87ed4cf60adc3665366d7f24464a1cbcf402bb0380b3eabd22d5feb82ee98d2e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87ed4cf60adc3665366d7f24464a1cbcf402bb0380b3eabd22d5feb82ee98d2e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87ed4cf60adc3665366d7f24464a1cbcf402bb0380b3eabd22d5feb82ee98d2e/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87ed4cf60adc3665366d7f24464a1cbcf402bb0380b3eabd22d5feb82ee98d2e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 12 13:40:54 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c.
Jan 12 13:40:54 compute-0 podman[191603]: 2026-01-12 13:40:54.143309834 +0000 UTC m=+0.075768100 container init 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + sudo -E kolla_set_configs
Jan 12 13:40:54 compute-0 podman[191603]: 2026-01-12 13:40:54.160855129 +0000 UTC m=+0.093313374 container start 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:40:54 compute-0 sudo[191621]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 12 13:40:54 compute-0 podman[191603]: ceilometer_agent_compute
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: sudo: unable to send audit message: Operation not permitted
Jan 12 13:40:54 compute-0 sudo[191621]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 12 13:40:54 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Jan 12 13:40:54 compute-0 sudo[191621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 12 13:40:54 compute-0 sudo[191561]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Validating config file
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Copying service configuration files
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: INFO:__main__:Writing out command to execute
Jan 12 13:40:54 compute-0 sudo[191621]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: ++ cat /run_command
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + ARGS=
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + sudo kolla_copy_cacerts
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: sudo: unable to send audit message: Operation not permitted
Jan 12 13:40:54 compute-0 sudo[191648]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 12 13:40:54 compute-0 sudo[191648]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 12 13:40:54 compute-0 sudo[191648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 12 13:40:54 compute-0 sudo[191648]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + [[ ! -n '' ]]
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + . kolla_extend_start
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + umask 0022
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 12 13:40:54 compute-0 podman[191622]: 2026-01-12 13:40:54.243444128 +0000 UTC m=+0.076632171 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 12 13:40:54 compute-0 systemd[1]: 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c-59e85403964c28a6.service: Main process exited, code=exited, status=1/FAILURE
Jan 12 13:40:54 compute-0 systemd[1]: 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c-59e85403964c28a6.service: Failed with result 'exit-code'.
Jan 12 13:40:54 compute-0 python3.9[191793]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.931 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.932 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.932 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.932 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.933 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.934 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.935 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.936 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.937 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.938 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.939 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.940 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.941 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.942 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.943 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.944 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.944 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.944 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.944 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.944 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.944 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.944 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.945 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.945 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.945 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.945 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.945 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.945 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.946 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.947 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.948 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.949 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.950 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.951 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.952 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.968 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.969 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 12 13:40:54 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:54.969 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.029 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.094 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.094 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.094 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.094 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.095 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.096 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.097 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.098 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.099 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.100 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.101 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.103 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.104 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.105 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.106 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.114 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.114 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.116 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.120 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:40:55.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:40:55 compute-0 sudo[191949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xljiowjklawnhoydhswbejhnsgrofyvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225255.1421883-558-16469735886137/AnsiballZ_stat.py'
Jan 12 13:40:55 compute-0 sudo[191949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:55 compute-0 python3.9[191951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:55 compute-0 sudo[191949]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:55 compute-0 sudo[192074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbzpaelfqhkwccmovqdvfloaioyarkaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225255.1421883-558-16469735886137/AnsiballZ_copy.py'
Jan 12 13:40:55 compute-0 sudo[192074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:55 compute-0 python3.9[192076]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225255.1421883-558-16469735886137/.source.yaml _original_basename=.z0gwy9o3 follow=False checksum=ca0aededc2de247899e1ccd51cb090384d834b58 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:55 compute-0 sudo[192074]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:56 compute-0 sudo[192226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-schulkdzrsdhndnhwrbtccklvzkfrofs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225256.024819-573-193832934642768/AnsiballZ_stat.py'
Jan 12 13:40:56 compute-0 sudo[192226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:56 compute-0 python3.9[192228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:56 compute-0 sudo[192226]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:56 compute-0 sudo[192349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpgwasyjqinsnxxwoytwdavbjhxkkezd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225256.024819-573-193832934642768/AnsiballZ_copy.py'
Jan 12 13:40:56 compute-0 sudo[192349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:56 compute-0 python3.9[192351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225256.024819-573-193832934642768/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:56 compute-0 sudo[192349]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:57 compute-0 sudo[192501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siygxchqroyldwfyglnpvixzyaohicln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225257.2197778-594-220015482419708/AnsiballZ_file.py'
Jan 12 13:40:57 compute-0 sudo[192501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:57 compute-0 python3.9[192503]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:57 compute-0 sudo[192501]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:57 compute-0 sudo[192653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alcxrlehbdkeqrrsbptpjctixwkydiqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225257.7648563-602-61864226633323/AnsiballZ_file.py'
Jan 12 13:40:57 compute-0 sudo[192653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:58 compute-0 python3.9[192655]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:40:58 compute-0 sudo[192653]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:58 compute-0 sudo[192805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evmgxoshhiwaboucebrndiaknktfakiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225258.2717843-610-191731590895997/AnsiballZ_stat.py'
Jan 12 13:40:58 compute-0 sudo[192805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:58 compute-0 python3.9[192807]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:40:58 compute-0 sudo[192805]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:58 compute-0 sudo[192883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fooemylesqikbiepjaakvysmgsuuluon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225258.2717843-610-191731590895997/AnsiballZ_file.py'
Jan 12 13:40:58 compute-0 sudo[192883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:40:58 compute-0 python3.9[192885]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.yfmscrsm recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:40:58 compute-0 sudo[192883]: pam_unix(sudo:session): session closed for user root
Jan 12 13:40:59 compute-0 python3.9[193035]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:00 compute-0 sudo[193456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thkbnebaglyqvaxxkwekzqtdbdvufhrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225260.4534898-647-191287827427342/AnsiballZ_container_config_data.py'
Jan 12 13:41:00 compute-0 sudo[193456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:00 compute-0 python3.9[193458]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 12 13:41:00 compute-0 sudo[193456]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:01 compute-0 sudo[193608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppzbfqgcsziogjpzdlwkrpwgwuanvlhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225261.0692744-658-214597353871143/AnsiballZ_container_config_hash.py'
Jan 12 13:41:01 compute-0 sudo[193608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:01 compute-0 python3.9[193610]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:41:01 compute-0 sudo[193608]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:01 compute-0 sudo[193760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vypavhvfqaxxxgmyybpruisdtglrjiei ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225261.643801-668-55976713162284/AnsiballZ_edpm_container_manage.py'
Jan 12 13:41:01 compute-0 sudo[193760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:02 compute-0 python3[193762]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:41:02 compute-0 podman[193791]: 2026-01-12 13:41:02.160673701 +0000 UTC m=+0.024921835 container create c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:41:02 compute-0 podman[193791]: 2026-01-12 13:41:02.14853056 +0000 UTC m=+0.012778692 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Jan 12 13:41:02 compute-0 python3[193762]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald 
--log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 12 13:41:02 compute-0 sudo[193760]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:02 compute-0 sudo[193968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpaawiwnqlnpapukeywjuwemxjqvooen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225262.3705888-676-44298838637268/AnsiballZ_stat.py'
Jan 12 13:41:02 compute-0 sudo[193968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:02 compute-0 python3.9[193970]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:41:02 compute-0 sudo[193968]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:03 compute-0 sudo[194122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgvtvpdpwdrucnndbufhwrorhnxpudkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225262.8650959-685-265679161944508/AnsiballZ_file.py'
Jan 12 13:41:03 compute-0 sudo[194122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:03 compute-0 python3.9[194124]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:03 compute-0 sudo[194122]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:03 compute-0 sudo[194198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxccdzlxrwwcqelfhyicghakpkbygjno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225262.8650959-685-265679161944508/AnsiballZ_stat.py'
Jan 12 13:41:03 compute-0 sudo[194198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:03 compute-0 python3.9[194200]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:41:03 compute-0 sudo[194198]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:03 compute-0 podman[194201]: 2026-01-12 13:41:03.573580322 +0000 UTC m=+0.066669113 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:41:03 compute-0 sudo[194373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwfwwiazstzlmgaeamrgmyoghgvtaseb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225263.5487027-685-147087825483214/AnsiballZ_copy.py'
Jan 12 13:41:03 compute-0 sudo[194373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:04 compute-0 python3.9[194375]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768225263.5487027-685-147087825483214/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:04 compute-0 sudo[194373]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:04 compute-0 sudo[194449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnvbroukqqivvlxeafklqtmoduthcwer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225263.5487027-685-147087825483214/AnsiballZ_systemd.py'
Jan 12 13:41:04 compute-0 sudo[194449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:04 compute-0 python3.9[194451]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:41:04 compute-0 systemd[1]: Reloading.
Jan 12 13:41:04 compute-0 systemd-rc-local-generator[194476]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:41:04 compute-0 systemd-sysv-generator[194480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:41:04 compute-0 sudo[194449]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:04 compute-0 sudo[194559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxcqzfghpbbirlnllsaleewfldmaqyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225263.5487027-685-147087825483214/AnsiballZ_systemd.py'
Jan 12 13:41:04 compute-0 sudo[194559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:05 compute-0 python3.9[194561]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:41:05 compute-0 systemd[1]: Reloading.
Jan 12 13:41:05 compute-0 systemd-rc-local-generator[194584]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:41:05 compute-0 systemd-sysv-generator[194588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:41:05 compute-0 systemd[1]: Starting node_exporter container...
Jan 12 13:41:05 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:41:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dad32b552e166db0dde4e67dfa93e54bf6923354f88b97de51a75f5681c3e1f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 12 13:41:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dad32b552e166db0dde4e67dfa93e54bf6923354f88b97de51a75f5681c3e1f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 12 13:41:05 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1.
Jan 12 13:41:05 compute-0 podman[194601]: 2026-01-12 13:41:05.443160475 +0000 UTC m=+0.072920467 container init c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.452Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=arp
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=bcache
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=bonding
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=cpu
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=edac
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=filefd
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=netclass
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=netdev
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=netstat
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=nfs
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=nvme
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=softnet
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=systemd
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=xfs
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=node_exporter.go:117 level=info collector=zfs
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 12 13:41:05 compute-0 node_exporter[194613]: ts=2026-01-12T13:41:05.453Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 12 13:41:05 compute-0 podman[194601]: 2026-01-12 13:41:05.463383423 +0000 UTC m=+0.093143415 container start c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:41:05 compute-0 podman[194601]: node_exporter
Jan 12 13:41:05 compute-0 systemd[1]: Started node_exporter container.
Jan 12 13:41:05 compute-0 sudo[194559]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:05 compute-0 podman[194622]: 2026-01-12 13:41:05.51423525 +0000 UTC m=+0.044489975 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:41:05 compute-0 python3.9[194792]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 12 13:41:06 compute-0 sudo[194942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpoxtyuzujdtcbbqgcabnohevvdgbolc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225266.3484054-730-34962997282693/AnsiballZ_stat.py'
Jan 12 13:41:06 compute-0 sudo[194942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:06 compute-0 python3.9[194944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:06 compute-0 sudo[194942]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:06 compute-0 sudo[195067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glpargflqpssxpexakbqtqqejmoirjbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225266.3484054-730-34962997282693/AnsiballZ_copy.py'
Jan 12 13:41:06 compute-0 sudo[195067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:07 compute-0 python3.9[195069]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225266.3484054-730-34962997282693/.source.yaml _original_basename=.87gp5ro6 follow=False checksum=a1b22d257da79c48a321f5b09118e5df03a513cc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:07 compute-0 sudo[195067]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:07 compute-0 sudo[195219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvwjbeakuckupitbjdlcrkymqikulxuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225267.2122324-745-78842846470674/AnsiballZ_stat.py'
Jan 12 13:41:07 compute-0 sudo[195219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:07 compute-0 python3.9[195221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:07 compute-0 sudo[195219]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:07 compute-0 sudo[195342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucyccafadmuhbgzzwtlapsbybnimkjiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225267.2122324-745-78842846470674/AnsiballZ_copy.py'
Jan 12 13:41:07 compute-0 sudo[195342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:07 compute-0 python3.9[195344]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225267.2122324-745-78842846470674/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:41:07 compute-0 sudo[195342]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:08 compute-0 sudo[195494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmxaauqtscrpevzwnrwwxoizmgmxpsvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225268.3637774-766-75555984701033/AnsiballZ_file.py'
Jan 12 13:41:08 compute-0 sudo[195494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:08 compute-0 python3.9[195496]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:08 compute-0 sudo[195494]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:09 compute-0 sudo[195646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvrfdsddcodymflzyueuvcswvoyuajmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225268.8412197-774-127942213526790/AnsiballZ_file.py'
Jan 12 13:41:09 compute-0 sudo[195646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:09 compute-0 python3.9[195648]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:41:09 compute-0 sudo[195646]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:09 compute-0 sudo[195798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prrnprydywjuxnehbdevmtukexqougoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225269.2932837-782-106319053080530/AnsiballZ_stat.py'
Jan 12 13:41:09 compute-0 sudo[195798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:09 compute-0 python3.9[195800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:09 compute-0 sudo[195798]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:09 compute-0 sudo[195876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyrcuiolzufktpyfakyxcbokhhjudxtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225269.2932837-782-106319053080530/AnsiballZ_file.py'
Jan 12 13:41:09 compute-0 sudo[195876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:09 compute-0 python3.9[195878]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.nczooy6g recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:09 compute-0 sudo[195876]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:10 compute-0 python3.9[196028]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:11 compute-0 sudo[196449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pomeojiacysiooivcjpmonlserddtsal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225271.509143-819-143167481898057/AnsiballZ_container_config_data.py'
Jan 12 13:41:11 compute-0 sudo[196449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:11 compute-0 python3.9[196451]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 12 13:41:11 compute-0 sudo[196449]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:12 compute-0 sudo[196610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbwcjzngvbobzyqdpibkqybbyluvgftn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225272.0944588-830-11116391953702/AnsiballZ_container_config_hash.py'
Jan 12 13:41:12 compute-0 sudo[196610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:12 compute-0 podman[196575]: 2026-01-12 13:41:12.293433384 +0000 UTC m=+0.042305341 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:41:12 compute-0 python3.9[196619]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:41:12 compute-0 sudo[196610]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:12 compute-0 sudo[196769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnsuteekcmkvyodqrnjrmtdseeishhxv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225272.7004611-840-131640337406332/AnsiballZ_edpm_container_manage.py'
Jan 12 13:41:12 compute-0 sudo[196769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:13 compute-0 python3[196771]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.481 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.482 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.482 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.494 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.494 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.494 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.494 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.494 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.494 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.495 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.495 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.495 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.508 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.508 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.508 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.508 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.686 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.686 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6015MB free_disk=73.58267593383789GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.687 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.687 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.744 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.744 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.761 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.769 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.770 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:41:13 compute-0 nova_compute[181978]: 2026-01-12 13:41:13.770 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:41:17 compute-0 podman[196782]: 2026-01-12 13:41:17.495790985 +0000 UTC m=+4.346131888 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Jan 12 13:41:17 compute-0 podman[196863]: 2026-01-12 13:41:17.590609839 +0000 UTC m=+0.027632751 container create 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 12 13:41:17 compute-0 podman[196863]: 2026-01-12 13:41:17.578205869 +0000 UTC m=+0.015228801 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Jan 12 13:41:17 compute-0 python3[196771]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 12 13:41:17 compute-0 sudo[196769]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:17 compute-0 sudo[197040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mowgrpameulonuujchzdpqgvsznghbwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225277.8036141-848-133775810090442/AnsiballZ_stat.py'
Jan 12 13:41:17 compute-0 sudo[197040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:18 compute-0 python3.9[197042]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:41:18 compute-0 sudo[197040]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:18 compute-0 sudo[197194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvcijnefnxjznhnemcodsrxghwscrkoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225278.3528852-857-112628355864938/AnsiballZ_file.py'
Jan 12 13:41:18 compute-0 sudo[197194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:18 compute-0 python3.9[197196]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:18 compute-0 sudo[197194]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:18 compute-0 sudo[197270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfhjgvbppftihbaghxfgfeqdpnhjdpam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225278.3528852-857-112628355864938/AnsiballZ_stat.py'
Jan 12 13:41:18 compute-0 sudo[197270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:19 compute-0 python3.9[197272]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:41:19 compute-0 sudo[197270]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:19 compute-0 sudo[197421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqvxawpdemhoyjsmfawrlvyznkbmvkvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225279.0829384-857-89430188474365/AnsiballZ_copy.py'
Jan 12 13:41:19 compute-0 sudo[197421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:19 compute-0 python3.9[197423]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768225279.0829384-857-89430188474365/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:19 compute-0 sudo[197421]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:19 compute-0 sudo[197497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwpfgdropfkjmsxckneqenpiywhbfebb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225279.0829384-857-89430188474365/AnsiballZ_systemd.py'
Jan 12 13:41:19 compute-0 sudo[197497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:19 compute-0 python3.9[197499]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:41:19 compute-0 systemd[1]: Reloading.
Jan 12 13:41:20 compute-0 systemd-sysv-generator[197524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:41:20 compute-0 systemd-rc-local-generator[197521]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:41:20 compute-0 sudo[197497]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:20 compute-0 sudo[197609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooudhmxbualymqmoetkdbafxiofbvayd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225279.0829384-857-89430188474365/AnsiballZ_systemd.py'
Jan 12 13:41:20 compute-0 sudo[197609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:20 compute-0 python3.9[197611]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:41:20 compute-0 systemd[1]: Reloading.
Jan 12 13:41:20 compute-0 systemd-sysv-generator[197640]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:41:20 compute-0 systemd-rc-local-generator[197637]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:41:20 compute-0 systemd[1]: Starting podman_exporter container...
Jan 12 13:41:20 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:41:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8452cfeb9f77bd5e58f3ddb0801a7ba9b6b492e84216366046c18051db67e5e9/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 12 13:41:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8452cfeb9f77bd5e58f3ddb0801a7ba9b6b492e84216366046c18051db67e5e9/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 12 13:41:21 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa.
Jan 12 13:41:21 compute-0 podman[197650]: 2026-01-12 13:41:21.011209765 +0000 UTC m=+0.084770492 container init 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 12 13:41:21 compute-0 podman_exporter[197662]: ts=2026-01-12T13:41:21.021Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 12 13:41:21 compute-0 podman_exporter[197662]: ts=2026-01-12T13:41:21.021Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 12 13:41:21 compute-0 podman_exporter[197662]: ts=2026-01-12T13:41:21.022Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 12 13:41:21 compute-0 podman_exporter[197662]: ts=2026-01-12T13:41:21.022Z caller=handler.go:105 level=info collector=container
Jan 12 13:41:21 compute-0 systemd[1]: Starting Podman API Service...
Jan 12 13:41:21 compute-0 systemd[1]: Started Podman API Service.
Jan 12 13:41:21 compute-0 podman[197650]: 2026-01-12 13:41:21.033682927 +0000 UTC m=+0.107243634 container start 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:41:21 compute-0 podman[197650]: podman_exporter
Jan 12 13:41:21 compute-0 systemd[1]: Started podman_exporter container.
Jan 12 13:41:21 compute-0 podman[197673]: time="2026-01-12T13:41:21Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 12 13:41:21 compute-0 podman[197673]: time="2026-01-12T13:41:21Z" level=info msg="Setting parallel job count to 13"
Jan 12 13:41:21 compute-0 podman[197673]: time="2026-01-12T13:41:21Z" level=info msg="Using sqlite as database backend"
Jan 12 13:41:21 compute-0 podman[197673]: time="2026-01-12T13:41:21Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 12 13:41:21 compute-0 podman[197673]: time="2026-01-12T13:41:21Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 12 13:41:21 compute-0 podman[197673]: time="2026-01-12T13:41:21Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 12 13:41:21 compute-0 sudo[197609]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:21 compute-0 podman[197673]: @ - - [12/Jan/2026:13:41:21 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 12 13:41:21 compute-0 podman[197673]: time="2026-01-12T13:41:21Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 12 13:41:21 compute-0 podman[197673]: @ - - [12/Jan/2026:13:41:21 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18883 "" "Go-http-client/1.1"
Jan 12 13:41:21 compute-0 podman_exporter[197662]: ts=2026-01-12T13:41:21.083Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 12 13:41:21 compute-0 podman_exporter[197662]: ts=2026-01-12T13:41:21.083Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 12 13:41:21 compute-0 podman_exporter[197662]: ts=2026-01-12T13:41:21.083Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 12 13:41:21 compute-0 podman[197671]: 2026-01-12 13:41:21.088580661 +0000 UTC m=+0.052752737 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:41:21 compute-0 systemd[1]: 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa-7a92e6e8ac76ed80.service: Main process exited, code=exited, status=1/FAILURE
Jan 12 13:41:21 compute-0 systemd[1]: 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa-7a92e6e8ac76ed80.service: Failed with result 'exit-code'.
Jan 12 13:41:21 compute-0 python3.9[197852]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 12 13:41:22 compute-0 sudo[198002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdjofkoskairkygajuhrwlcfnkwcgroo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225281.9164617-902-220978719178565/AnsiballZ_stat.py'
Jan 12 13:41:22 compute-0 sudo[198002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:22 compute-0 python3.9[198004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:22 compute-0 sudo[198002]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:22 compute-0 sudo[198127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elfmttucqomoaffvsywvkmzlacaehjuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225281.9164617-902-220978719178565/AnsiballZ_copy.py'
Jan 12 13:41:22 compute-0 sudo[198127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:22 compute-0 python3.9[198129]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225281.9164617-902-220978719178565/.source.yaml _original_basename=.g4lgaqbx follow=False checksum=bb80dc18cc5330259392f7d610662c1b0ccb4297 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:22 compute-0 sudo[198127]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:22 compute-0 sudo[198279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxavhzliiaamkkumumneukggycpkfrdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225282.7926166-917-131496825902465/AnsiballZ_stat.py'
Jan 12 13:41:22 compute-0 sudo[198279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:23 compute-0 python3.9[198281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:23 compute-0 sudo[198279]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:23 compute-0 sudo[198402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfnikdspjmwydyaijlktyoosuxynvnev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225282.7926166-917-131496825902465/AnsiballZ_copy.py'
Jan 12 13:41:23 compute-0 sudo[198402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:23 compute-0 python3.9[198404]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1768225282.7926166-917-131496825902465/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:41:23 compute-0 sudo[198402]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:24 compute-0 sudo[198554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjoinkwknrltcbujnwemgenmocghurdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225284.0461526-938-62576892710712/AnsiballZ_file.py'
Jan 12 13:41:24 compute-0 sudo[198554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:24 compute-0 python3.9[198556]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:24 compute-0 sudo[198554]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:24 compute-0 podman[198557]: 2026-01-12 13:41:24.443773699 +0000 UTC m=+0.040524881 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 12 13:41:24 compute-0 systemd[1]: 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c-59e85403964c28a6.service: Main process exited, code=exited, status=1/FAILURE
Jan 12 13:41:24 compute-0 systemd[1]: 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c-59e85403964c28a6.service: Failed with result 'exit-code'.
Jan 12 13:41:24 compute-0 sudo[198723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zftmymchswydzwttiwglgrtcckumnqqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225284.5563345-946-105721021290818/AnsiballZ_file.py'
Jan 12 13:41:24 compute-0 sudo[198723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:24 compute-0 python3.9[198725]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 12 13:41:24 compute-0 sudo[198723]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:25 compute-0 sudo[198875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxwzzgxgffkltdufmdxuvugfcxujovm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225285.0267053-954-266903716715661/AnsiballZ_stat.py'
Jan 12 13:41:25 compute-0 sudo[198875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:25 compute-0 python3.9[198877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:25 compute-0 sudo[198875]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:25 compute-0 sudo[198953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eleqwahcrnzyfnuwbrindygrdicyfkao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225285.0267053-954-266903716715661/AnsiballZ_file.py'
Jan 12 13:41:25 compute-0 sudo[198953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:25 compute-0 python3.9[198955]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.lpvkv1y5 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:25 compute-0 sudo[198953]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:26 compute-0 python3.9[199105]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:27 compute-0 sudo[199526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zemvqnlooqmshmrglyqmjdtevawibhqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225287.1548636-991-118333806581342/AnsiballZ_container_config_data.py'
Jan 12 13:41:27 compute-0 sudo[199526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:27 compute-0 python3.9[199528]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 12 13:41:27 compute-0 sudo[199526]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:27 compute-0 sudo[199678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmvtlzfcedkdwgcbuuiewgfgivnkvcoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225287.7616942-1002-122998877687968/AnsiballZ_container_config_hash.py'
Jan 12 13:41:27 compute-0 sudo[199678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:28 compute-0 python3.9[199680]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 12 13:41:28 compute-0 sudo[199678]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:28 compute-0 sudo[199830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcubxxoesxdeumpahwdqhedxeujsijtq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225288.370728-1012-119384839715150/AnsiballZ_edpm_container_manage.py'
Jan 12 13:41:28 compute-0 sudo[199830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:28 compute-0 python3[199832]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 12 13:41:31 compute-0 podman[199842]: 2026-01-12 13:41:31.237983561 +0000 UTC m=+2.400158832 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Jan 12 13:41:31 compute-0 podman[199923]: 2026-01-12 13:41:31.329962025 +0000 UTC m=+0.029429362 container create e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6)
Jan 12 13:41:31 compute-0 podman[199923]: 2026-01-12 13:41:31.316407534 +0000 UTC m=+0.015874880 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Jan 12 13:41:31 compute-0 python3[199832]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Jan 12 13:41:31 compute-0 sudo[199830]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:31 compute-0 sudo[200100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxwhlbdeytjegijthtzzfzckyyrgyata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225291.5302715-1020-198819609604713/AnsiballZ_stat.py'
Jan 12 13:41:31 compute-0 sudo[200100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:31 compute-0 python3.9[200102]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:41:31 compute-0 sudo[200100]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:32 compute-0 sudo[200254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgvxasitmxkwgyhdbxkcwhgrutjwptbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225292.0370119-1029-9469257003612/AnsiballZ_file.py'
Jan 12 13:41:32 compute-0 sudo[200254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:32 compute-0 python3.9[200256]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:32 compute-0 sudo[200254]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:32 compute-0 sudo[200330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvurundygbosmvykybbnnaulinlhajwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225292.0370119-1029-9469257003612/AnsiballZ_stat.py'
Jan 12 13:41:32 compute-0 sudo[200330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:32 compute-0 python3.9[200332]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:41:32 compute-0 sudo[200330]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:33 compute-0 sudo[200481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xekypifgqlfiiumypcuipyefckezwrwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225292.7454638-1029-26995112809717/AnsiballZ_copy.py'
Jan 12 13:41:33 compute-0 sudo[200481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:33 compute-0 python3.9[200483]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1768225292.7454638-1029-26995112809717/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:33 compute-0 sudo[200481]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:33 compute-0 sudo[200557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kabqomehmsgpbelugmeykjmkfylmmljg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225292.7454638-1029-26995112809717/AnsiballZ_systemd.py'
Jan 12 13:41:33 compute-0 sudo[200557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:33 compute-0 python3.9[200559]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 12 13:41:33 compute-0 systemd[1]: Reloading.
Jan 12 13:41:33 compute-0 systemd-sysv-generator[200599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:41:33 compute-0 podman[200561]: 2026-01-12 13:41:33.718320898 +0000 UTC m=+0.068247299 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 12 13:41:33 compute-0 systemd-rc-local-generator[200595]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:41:33 compute-0 sudo[200557]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:34 compute-0 sudo[200691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtklhzeobfftrobdqfsjvhqnwtgwieym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225292.7454638-1029-26995112809717/AnsiballZ_systemd.py'
Jan 12 13:41:34 compute-0 sudo[200691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:34 compute-0 python3.9[200693]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 12 13:41:34 compute-0 systemd[1]: Reloading.
Jan 12 13:41:34 compute-0 systemd-sysv-generator[200723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 12 13:41:34 compute-0 systemd-rc-local-generator[200719]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 12 13:41:34 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 12 13:41:34 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:41:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78f491a1671dd74feb483f320398953fb559e2ac29ebf0a8896e634427527143/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 12 13:41:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78f491a1671dd74feb483f320398953fb559e2ac29ebf0a8896e634427527143/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 12 13:41:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78f491a1671dd74feb483f320398953fb559e2ac29ebf0a8896e634427527143/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 12 13:41:34 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd.
Jan 12 13:41:34 compute-0 podman[200732]: 2026-01-12 13:41:34.679266401 +0000 UTC m=+0.079361653 container init e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *bridge.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *coverage.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *datapath.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *iface.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *memory.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *ovn.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *pmd_perf.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *pmd_rxq.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: INFO    13:41:34 main.go:48: registering *vswitch.Collector
Jan 12 13:41:34 compute-0 openstack_network_exporter[200744]: NOTICE  13:41:34 main.go:76: listening on https://:9105/metrics
Jan 12 13:41:34 compute-0 podman[200732]: 2026-01-12 13:41:34.699055768 +0000 UTC m=+0.099151010 container start e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Jan 12 13:41:34 compute-0 podman[200732]: openstack_network_exporter
Jan 12 13:41:34 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 12 13:41:34 compute-0 sudo[200691]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:34 compute-0 podman[200754]: 2026-01-12 13:41:34.757386771 +0000 UTC m=+0.050606477 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Jan 12 13:41:35 compute-0 python3.9[200922]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 12 13:41:35 compute-0 sudo[201082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhwbtuijyiwrsjefumjbhvwqxihvykks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225295.6053011-1074-259590447178964/AnsiballZ_stat.py'
Jan 12 13:41:35 compute-0 sudo[201082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:35 compute-0 podman[201046]: 2026-01-12 13:41:35.803702537 +0000 UTC m=+0.036923664 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:41:35 compute-0 python3.9[201095]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:35 compute-0 sudo[201082]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:36 compute-0 sudo[201218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjqadevurijwubznfffyyyalplqtvri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225295.6053011-1074-259590447178964/AnsiballZ_copy.py'
Jan 12 13:41:36 compute-0 sudo[201218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:36 compute-0 python3.9[201220]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225295.6053011-1074-259590447178964/.source.yaml _original_basename=.julioyqv follow=False checksum=a2e7a21c14ad8c7be52512c0965b010bd60b6d2a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:36 compute-0 sudo[201218]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:36 compute-0 sudo[201370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svjrmzgtspxjlcpcjqptvpricxcxfijn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225296.4861383-1089-98522298774332/AnsiballZ_find.py'
Jan 12 13:41:36 compute-0 sudo[201370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:36 compute-0 python3.9[201372]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 12 13:41:36 compute-0 sudo[201370]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:37 compute-0 sudo[201522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tflmghlyyrqujbxxininbczifhvuwepz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225297.11865-1099-201278349770753/AnsiballZ_podman_container_info.py'
Jan 12 13:41:37 compute-0 sudo[201522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:37 compute-0 python3.9[201524]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 12 13:41:37 compute-0 sudo[201522]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:38 compute-0 sudo[201684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrexmiwdihhgkahgbfvrzllzokpzxkzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225297.736918-1107-197843994116692/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:38 compute-0 sudo[201684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:38 compute-0 python3.9[201686]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:38 compute-0 systemd[1]: Started libpod-conmon-317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c.scope.
Jan 12 13:41:38 compute-0 podman[201687]: 2026-01-12 13:41:38.264509686 +0000 UTC m=+0.050625142 container exec 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 12 13:41:38 compute-0 podman[201687]: 2026-01-12 13:41:38.271103129 +0000 UTC m=+0.057218564 container exec_died 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:41:38 compute-0 sudo[201684]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:38 compute-0 systemd[1]: libpod-conmon-317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c.scope: Deactivated successfully.
Jan 12 13:41:38 compute-0 sudo[201861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kktubkpnjwiyvztozilhloeijcxkrpxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225298.41641-1115-136012711382154/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:38 compute-0 sudo[201861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:38 compute-0 python3.9[201863]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:38 compute-0 systemd[1]: Started libpod-conmon-317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c.scope.
Jan 12 13:41:38 compute-0 podman[201864]: 2026-01-12 13:41:38.81905369 +0000 UTC m=+0.051495464 container exec 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:41:38 compute-0 podman[201864]: 2026-01-12 13:41:38.825058573 +0000 UTC m=+0.057500356 container exec_died 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 12 13:41:38 compute-0 sudo[201861]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:38 compute-0 systemd[1]: libpod-conmon-317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c.scope: Deactivated successfully.
Jan 12 13:41:39 compute-0 auditd[673]: Audit daemon rotating log files
Jan 12 13:41:39 compute-0 sudo[202039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfzqxhtrvxxanqredfxznpledmpvwpro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225298.9648068-1123-259131816776898/AnsiballZ_file.py'
Jan 12 13:41:39 compute-0 sudo[202039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:39 compute-0 python3.9[202041]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:39 compute-0 sudo[202039]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:39 compute-0 sudo[202191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfztdnmniwmniygoyegdcihbfzeslbbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225299.4467447-1132-62095679023207/AnsiballZ_podman_container_info.py'
Jan 12 13:41:39 compute-0 sudo[202191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:39 compute-0 python3.9[202193]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 12 13:41:39 compute-0 sudo[202191]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:40 compute-0 sudo[202353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dldvoolydtaxiompnrcdkaoppmtbhahd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225299.9311318-1140-73710158097510/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:40 compute-0 sudo[202353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:41:40.194 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:41:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:41:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:41:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:41:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:41:40 compute-0 python3.9[202355]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:40 compute-0 systemd[1]: Started libpod-conmon-58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184.scope.
Jan 12 13:41:40 compute-0 podman[202356]: 2026-01-12 13:41:40.328430693 +0000 UTC m=+0.040422898 container exec 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:41:40 compute-0 podman[202373]: 2026-01-12 13:41:40.383961432 +0000 UTC m=+0.044796692 container exec_died 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 12 13:41:40 compute-0 podman[202356]: 2026-01-12 13:41:40.386777366 +0000 UTC m=+0.098769580 container exec_died 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 12 13:41:40 compute-0 systemd[1]: libpod-conmon-58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184.scope: Deactivated successfully.
Jan 12 13:41:40 compute-0 sudo[202353]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:40 compute-0 sudo[202532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hykihgmyuallddtfajvptfgfgstdzkkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225300.524352-1148-111650816432181/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:40 compute-0 sudo[202532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:40 compute-0 python3.9[202534]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:40 compute-0 systemd[1]: Started libpod-conmon-58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184.scope.
Jan 12 13:41:40 compute-0 podman[202535]: 2026-01-12 13:41:40.933666064 +0000 UTC m=+0.048663390 container exec 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:41:40 compute-0 podman[202550]: 2026-01-12 13:41:40.986712213 +0000 UTC m=+0.043077218 container exec_died 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 12 13:41:40 compute-0 podman[202535]: 2026-01-12 13:41:40.989172777 +0000 UTC m=+0.104170074 container exec_died 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 12 13:41:40 compute-0 systemd[1]: libpod-conmon-58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184.scope: Deactivated successfully.
Jan 12 13:41:41 compute-0 sudo[202532]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:41 compute-0 sudo[202709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwjnqxtxtgkwsennzwlskyrcmwmblxwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225301.1207967-1156-46889628846395/AnsiballZ_file.py'
Jan 12 13:41:41 compute-0 sudo[202709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:41 compute-0 python3.9[202711]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:41 compute-0 sudo[202709]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:41 compute-0 sudo[202861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osxrqrngmwglabuutolfovsllmpstqtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225301.6266034-1165-12546068352624/AnsiballZ_podman_container_info.py'
Jan 12 13:41:41 compute-0 sudo[202861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:41 compute-0 python3.9[202863]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 12 13:41:41 compute-0 sudo[202861]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:42 compute-0 sudo[203022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsnirotaylfrjbozaassqzughpvbeifk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225302.1142437-1173-278767759603103/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:42 compute-0 sudo[203022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:42 compute-0 podman[203024]: 2026-01-12 13:41:42.358449791 +0000 UTC m=+0.035867350 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:41:42 compute-0 python3.9[203025]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:42 compute-0 systemd[1]: Started libpod-conmon-6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c.scope.
Jan 12 13:41:42 compute-0 podman[203041]: 2026-01-12 13:41:42.539978795 +0000 UTC m=+0.049635574 container exec 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 12 13:41:42 compute-0 podman[203057]: 2026-01-12 13:41:42.594136423 +0000 UTC m=+0.044489763 container exec_died 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 12 13:41:42 compute-0 podman[203041]: 2026-01-12 13:41:42.597486455 +0000 UTC m=+0.107143244 container exec_died 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:41:42 compute-0 systemd[1]: libpod-conmon-6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c.scope: Deactivated successfully.
Jan 12 13:41:42 compute-0 sudo[203022]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:42 compute-0 sudo[203216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gthdliwmsftscktvfiqapbvpaiweubti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225302.7484214-1181-214086606525658/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:42 compute-0 sudo[203216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:43 compute-0 python3.9[203218]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:43 compute-0 systemd[1]: Started libpod-conmon-6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c.scope.
Jan 12 13:41:43 compute-0 podman[203219]: 2026-01-12 13:41:43.158951552 +0000 UTC m=+0.049069596 container exec 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 12 13:41:43 compute-0 podman[203235]: 2026-01-12 13:41:43.213970394 +0000 UTC m=+0.044773658 container exec_died 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 12 13:41:43 compute-0 podman[203219]: 2026-01-12 13:41:43.216911274 +0000 UTC m=+0.107029328 container exec_died 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 12 13:41:43 compute-0 systemd[1]: libpod-conmon-6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c.scope: Deactivated successfully.
Jan 12 13:41:43 compute-0 sudo[203216]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:43 compute-0 sudo[203395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqnnufuksjjjkqsrpcbtikqubrbjfgrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225303.3629994-1189-54757291733190/AnsiballZ_file.py'
Jan 12 13:41:43 compute-0 sudo[203395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:43 compute-0 python3.9[203397]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:43 compute-0 sudo[203395]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:44 compute-0 sudo[203547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjshbbpnoffngfiljbvbnwpzlvwkuphy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225303.8596263-1198-13267960901962/AnsiballZ_podman_container_info.py'
Jan 12 13:41:44 compute-0 sudo[203547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:44 compute-0 python3.9[203549]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 12 13:41:44 compute-0 sudo[203547]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:44 compute-0 sudo[203709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igheodbuhhirvolaxxdpsgtukcshuoom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225304.362484-1206-206432496816177/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:44 compute-0 sudo[203709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:44 compute-0 python3.9[203711]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:44 compute-0 systemd[1]: Started libpod-conmon-c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1.scope.
Jan 12 13:41:44 compute-0 podman[203712]: 2026-01-12 13:41:44.761490281 +0000 UTC m=+0.044050824 container exec c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:41:44 compute-0 podman[203728]: 2026-01-12 13:41:44.815983542 +0000 UTC m=+0.044966853 container exec_died c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 12 13:41:44 compute-0 podman[203712]: 2026-01-12 13:41:44.818477238 +0000 UTC m=+0.101037771 container exec_died c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:41:44 compute-0 systemd[1]: libpod-conmon-c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1.scope: Deactivated successfully.
Jan 12 13:41:44 compute-0 sudo[203709]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:45 compute-0 sudo[203886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pafdnchavrkrjnsmztwcsyogiqlqebsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225304.953511-1214-46802711594560/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:45 compute-0 sudo[203886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:45 compute-0 python3.9[203888]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:45 compute-0 systemd[1]: Started libpod-conmon-c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1.scope.
Jan 12 13:41:45 compute-0 podman[203889]: 2026-01-12 13:41:45.357253636 +0000 UTC m=+0.050541774 container exec c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:41:45 compute-0 podman[203906]: 2026-01-12 13:41:45.410979589 +0000 UTC m=+0.044270519 container exec_died c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 12 13:41:45 compute-0 podman[203889]: 2026-01-12 13:41:45.41482723 +0000 UTC m=+0.108115368 container exec_died c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:41:45 compute-0 systemd[1]: libpod-conmon-c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1.scope: Deactivated successfully.
Jan 12 13:41:45 compute-0 sudo[203886]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:45 compute-0 sudo[204065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpobzkzxkwrbufzrsxmpshwpwlierysf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225305.5576339-1222-208501795293641/AnsiballZ_file.py'
Jan 12 13:41:45 compute-0 sudo[204065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:45 compute-0 python3.9[204067]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:45 compute-0 sudo[204065]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:46 compute-0 sudo[204217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zirucaznifvjxcibjlrykvjpurbgcoba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225306.080742-1231-238694450523805/AnsiballZ_podman_container_info.py'
Jan 12 13:41:46 compute-0 sudo[204217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:46 compute-0 python3.9[204219]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 12 13:41:46 compute-0 sudo[204217]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:46 compute-0 sudo[204379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-varwwpwhkmasiwqjjnwlcfrsgfxwnjmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225306.5982218-1239-190131488768543/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:46 compute-0 sudo[204379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:46 compute-0 python3.9[204381]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:46 compute-0 systemd[1]: Started libpod-conmon-82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa.scope.
Jan 12 13:41:46 compute-0 podman[204382]: 2026-01-12 13:41:46.989764339 +0000 UTC m=+0.040803115 container exec 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 12 13:41:47 compute-0 podman[204398]: 2026-01-12 13:41:47.043956111 +0000 UTC m=+0.045433313 container exec_died 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:41:47 compute-0 podman[204382]: 2026-01-12 13:41:47.046550827 +0000 UTC m=+0.097589593 container exec_died 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:41:47 compute-0 systemd[1]: libpod-conmon-82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa.scope: Deactivated successfully.
Jan 12 13:41:47 compute-0 sudo[204379]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:47 compute-0 sudo[204556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygjkpbupbvpbnrbdxfvwjelewmvhpvyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225307.185862-1247-195650040752066/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:47 compute-0 sudo[204556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:47 compute-0 python3.9[204558]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:47 compute-0 systemd[1]: Started libpod-conmon-82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa.scope.
Jan 12 13:41:47 compute-0 podman[204559]: 2026-01-12 13:41:47.59955148 +0000 UTC m=+0.053502892 container exec 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:41:47 compute-0 podman[204575]: 2026-01-12 13:41:47.654946162 +0000 UTC m=+0.043889238 container exec_died 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:41:47 compute-0 podman[204559]: 2026-01-12 13:41:47.65717096 +0000 UTC m=+0.111122352 container exec_died 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:41:47 compute-0 systemd[1]: libpod-conmon-82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa.scope: Deactivated successfully.
Jan 12 13:41:47 compute-0 sudo[204556]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:47 compute-0 sudo[204734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnjhsqigtrbauxkdqdxdcixbcmuzwclr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225307.7997181-1255-184855499087301/AnsiballZ_file.py'
Jan 12 13:41:47 compute-0 sudo[204734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:48 compute-0 python3.9[204736]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:48 compute-0 sudo[204734]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:48 compute-0 sudo[204886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbqsaqrrxynayzzeuuvjowdvocuiysls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225308.2931497-1264-122930546769784/AnsiballZ_podman_container_info.py'
Jan 12 13:41:48 compute-0 sudo[204886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:48 compute-0 python3.9[204888]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 12 13:41:48 compute-0 sudo[204886]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:48 compute-0 sudo[205047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewalpgliicfxkrfupkhsgvxduumjcnwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225308.8069215-1272-231536135657382/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:48 compute-0 sudo[205047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:49 compute-0 python3.9[205049]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:49 compute-0 systemd[1]: Started libpod-conmon-e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd.scope.
Jan 12 13:41:49 compute-0 podman[205050]: 2026-01-12 13:41:49.193502131 +0000 UTC m=+0.041902650 container exec e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 12 13:41:49 compute-0 podman[205050]: 2026-01-12 13:41:49.200032436 +0000 UTC m=+0.048432953 container exec_died e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Jan 12 13:41:49 compute-0 sudo[205047]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:49 compute-0 systemd[1]: libpod-conmon-e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd.scope: Deactivated successfully.
Jan 12 13:41:49 compute-0 sudo[205225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjktijnrqdhksbwjzygqzsssbwaonxxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225309.3338783-1280-2221395156030/AnsiballZ_podman_container_exec.py'
Jan 12 13:41:49 compute-0 sudo[205225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:49 compute-0 python3.9[205227]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 12 13:41:49 compute-0 systemd[1]: Started libpod-conmon-e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd.scope.
Jan 12 13:41:49 compute-0 podman[205228]: 2026-01-12 13:41:49.711969242 +0000 UTC m=+0.039522789 container exec e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Jan 12 13:41:49 compute-0 podman[205244]: 2026-01-12 13:41:49.762952889 +0000 UTC m=+0.042868914 container exec_died e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Jan 12 13:41:49 compute-0 podman[205228]: 2026-01-12 13:41:49.765957439 +0000 UTC m=+0.093510987 container exec_died e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Jan 12 13:41:49 compute-0 systemd[1]: libpod-conmon-e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd.scope: Deactivated successfully.
Jan 12 13:41:49 compute-0 sudo[205225]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:50 compute-0 sudo[205402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxbuancgufscwrgsbyloanmpghqbmqgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225309.9118388-1288-189656060976648/AnsiballZ_file.py'
Jan 12 13:41:50 compute-0 sudo[205402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:50 compute-0 python3.9[205404]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:50 compute-0 sudo[205402]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:50 compute-0 sudo[205554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idxrmhcgakjpqiephlspozwocwbtpslj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225310.4134538-1297-261301345359477/AnsiballZ_file.py'
Jan 12 13:41:50 compute-0 sudo[205554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:50 compute-0 python3.9[205556]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:50 compute-0 sudo[205554]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:51 compute-0 sudo[205706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqvvrbnmvyyfqpnznqphntoghflcxyvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225310.872455-1305-270826571755284/AnsiballZ_stat.py'
Jan 12 13:41:51 compute-0 sudo[205706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:51 compute-0 python3.9[205708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:51 compute-0 sudo[205706]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:51 compute-0 sudo[205838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bitqszycdfszprbevjoqmukocgxwapks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225310.872455-1305-270826571755284/AnsiballZ_copy.py'
Jan 12 13:41:51 compute-0 sudo[205838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:51 compute-0 podman[205803]: 2026-01-12 13:41:51.445391337 +0000 UTC m=+0.042854687 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:41:51 compute-0 python3.9[205845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1768225310.872455-1305-270826571755284/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:51 compute-0 sudo[205838]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:51 compute-0 sudo[206002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvgeusfakrgqftlpygmkdkvcefvejlyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225311.7918863-1321-85047345467836/AnsiballZ_file.py'
Jan 12 13:41:51 compute-0 sudo[206002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:52 compute-0 python3.9[206004]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:52 compute-0 sudo[206002]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:52 compute-0 sudo[206154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwiaxapnthuqlcchwzpnyqljjptyhqhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225312.2509434-1329-190669251648112/AnsiballZ_stat.py'
Jan 12 13:41:52 compute-0 sudo[206154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:52 compute-0 python3.9[206156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:52 compute-0 sudo[206154]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:52 compute-0 sudo[206232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwebqcxtemihjwrdwddfrtxydigzyiei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225312.2509434-1329-190669251648112/AnsiballZ_file.py'
Jan 12 13:41:52 compute-0 sudo[206232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:52 compute-0 python3.9[206234]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:52 compute-0 sudo[206232]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:53 compute-0 sudo[206384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxsmikpgypzrmcvkazkizycqngqswthn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225313.015614-1341-279242522300174/AnsiballZ_stat.py'
Jan 12 13:41:53 compute-0 sudo[206384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:53 compute-0 python3.9[206386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:53 compute-0 sudo[206384]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:53 compute-0 sudo[206462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyliwxpkpdjbojiunxtwmgcurxriczrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225313.015614-1341-279242522300174/AnsiballZ_file.py'
Jan 12 13:41:53 compute-0 sudo[206462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:53 compute-0 python3.9[206464]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.c12wd82j recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:53 compute-0 sudo[206462]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:53 compute-0 sudo[206614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdpspbpienpgcipxgmrtrqpshdomegyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225313.7814853-1353-234383192477037/AnsiballZ_stat.py'
Jan 12 13:41:53 compute-0 sudo[206614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:54 compute-0 python3.9[206616]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:54 compute-0 sudo[206614]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:54 compute-0 sudo[206692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlzvducuqdwzupbgwohjujxhhokhdnjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225313.7814853-1353-234383192477037/AnsiballZ_file.py'
Jan 12 13:41:54 compute-0 sudo[206692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:54 compute-0 python3.9[206694]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:54 compute-0 sudo[206692]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:54 compute-0 podman[206719]: 2026-01-12 13:41:54.540684237 +0000 UTC m=+0.038520638 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true)
Jan 12 13:41:54 compute-0 sudo[206861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qixxpwutddtfddtgfzhlrhyfmrrbtbny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225314.5924182-1366-38413939750452/AnsiballZ_command.py'
Jan 12 13:41:54 compute-0 sudo[206861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:54 compute-0 python3.9[206863]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:41:54 compute-0 sudo[206861]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:55 compute-0 sudo[207014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuvyexyivjbqufuxearbpfixvpmuawoo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1768225315.0614922-1374-156726083697031/AnsiballZ_edpm_nftables_from_files.py'
Jan 12 13:41:55 compute-0 sudo[207014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:55 compute-0 python3[207016]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 12 13:41:55 compute-0 sudo[207014]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:55 compute-0 sudo[207166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brtllgilsfczesclbjtpadaoiqnkpxqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225315.6647425-1382-42253668212844/AnsiballZ_stat.py'
Jan 12 13:41:55 compute-0 sudo[207166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:56 compute-0 python3.9[207168]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:56 compute-0 sudo[207166]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:56 compute-0 sudo[207244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvlllhpvdprfudkvgxqjazdxgcjwjqgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225315.6647425-1382-42253668212844/AnsiballZ_file.py'
Jan 12 13:41:56 compute-0 sudo[207244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:56 compute-0 python3.9[207246]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:56 compute-0 sudo[207244]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:56 compute-0 sudo[207396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkxdxgzqvggcahxfoiebheiyhgmvqggq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225316.5084553-1394-57271350178076/AnsiballZ_stat.py'
Jan 12 13:41:56 compute-0 sudo[207396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:56 compute-0 python3.9[207398]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:56 compute-0 sudo[207396]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:57 compute-0 sudo[207474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dodxyiwxnehibzvrdfypadklbfcqrgyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225316.5084553-1394-57271350178076/AnsiballZ_file.py'
Jan 12 13:41:57 compute-0 sudo[207474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:57 compute-0 python3.9[207476]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:57 compute-0 sudo[207474]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:57 compute-0 sudo[207626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsoaqxvutucibsewikzehylhihrrneza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225317.3146389-1406-117508832365120/AnsiballZ_stat.py'
Jan 12 13:41:57 compute-0 sudo[207626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:57 compute-0 python3.9[207628]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:57 compute-0 sudo[207626]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:57 compute-0 sudo[207704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esfijkdnlxwqhrerjjnidailgoakpjfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225317.3146389-1406-117508832365120/AnsiballZ_file.py'
Jan 12 13:41:57 compute-0 sudo[207704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:58 compute-0 python3.9[207706]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:58 compute-0 sudo[207704]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:58 compute-0 sudo[207856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeypysfrxctludutxtwsxfivbdazwmli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225318.1209812-1418-6208889783800/AnsiballZ_stat.py'
Jan 12 13:41:58 compute-0 sudo[207856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:58 compute-0 python3.9[207858]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:58 compute-0 sudo[207856]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:58 compute-0 sudo[207934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfkawjtvugvgplcxhvlwkqfvdsesgjbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225318.1209812-1418-6208889783800/AnsiballZ_file.py'
Jan 12 13:41:58 compute-0 sudo[207934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:58 compute-0 python3.9[207936]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:58 compute-0 sudo[207934]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:59 compute-0 sudo[208086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqtlmomkkeufztbqiysxowybjdryriho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225318.9286363-1430-110303424073114/AnsiballZ_stat.py'
Jan 12 13:41:59 compute-0 sudo[208086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:59 compute-0 python3.9[208088]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 12 13:41:59 compute-0 sudo[208086]: pam_unix(sudo:session): session closed for user root
Jan 12 13:41:59 compute-0 sudo[208211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfvohahxzleiqtbrpuuvckgrztkwywqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225318.9286363-1430-110303424073114/AnsiballZ_copy.py'
Jan 12 13:41:59 compute-0 sudo[208211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:41:59 compute-0 python3.9[208213]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1768225318.9286363-1430-110303424073114/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:41:59 compute-0 sudo[208211]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:00 compute-0 sudo[208363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abmiuqwmgujtveoypbdcxhtcwjnpvyje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225319.848541-1445-65912202363371/AnsiballZ_file.py'
Jan 12 13:42:00 compute-0 sudo[208363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:42:00 compute-0 python3.9[208365]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:42:00 compute-0 sudo[208363]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:00 compute-0 sudo[208515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixijzpwzbaaalmcugfawldzciwznldzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225320.293622-1453-129127598294083/AnsiballZ_command.py'
Jan 12 13:42:00 compute-0 sudo[208515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:42:00 compute-0 python3.9[208517]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:42:00 compute-0 sudo[208515]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:01 compute-0 sudo[208670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykoiwkjqhyuyzfazsuwinovmtormllie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225320.7600737-1461-112481661964909/AnsiballZ_blockinfile.py'
Jan 12 13:42:01 compute-0 sudo[208670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:42:01 compute-0 python3.9[208672]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:42:01 compute-0 sudo[208670]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:01 compute-0 sudo[208822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kazmahgopawyszeavselzkrehsllixio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225321.4314525-1470-141465561358967/AnsiballZ_command.py'
Jan 12 13:42:01 compute-0 sudo[208822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:42:01 compute-0 python3.9[208824]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:42:01 compute-0 sudo[208822]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:02 compute-0 sudo[208975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foqpwpzjgcragepttivvqgvlwfequzvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225321.9077048-1478-229516506928118/AnsiballZ_stat.py'
Jan 12 13:42:02 compute-0 sudo[208975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:42:02 compute-0 python3.9[208977]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 12 13:42:02 compute-0 sudo[208975]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:02 compute-0 sudo[209129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrqilsdsfqbejxnizeuikmampkwyrwuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225322.393203-1486-1064776823676/AnsiballZ_command.py'
Jan 12 13:42:02 compute-0 sudo[209129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:42:02 compute-0 python3.9[209131]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 12 13:42:02 compute-0 sudo[209129]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:03 compute-0 sudo[209284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndpldksirpewtojmiunwxpwuetwovqhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1768225322.8624017-1494-96760640385239/AnsiballZ_file.py'
Jan 12 13:42:03 compute-0 sudo[209284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:42:03 compute-0 python3.9[209286]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 12 13:42:03 compute-0 sudo[209284]: pam_unix(sudo:session): session closed for user root
Jan 12 13:42:03 compute-0 sshd-session[182310]: Connection closed by 192.168.122.30 port 34696
Jan 12 13:42:03 compute-0 sshd-session[182307]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:42:03 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 12 13:42:03 compute-0 systemd[1]: session-25.scope: Consumed 1min 13.638s CPU time.
Jan 12 13:42:03 compute-0 systemd-logind[775]: Session 25 logged out. Waiting for processes to exit.
Jan 12 13:42:03 compute-0 systemd-logind[775]: Removed session 25.
Jan 12 13:42:03 compute-0 podman[209311]: 2026-01-12 13:42:03.994420625 +0000 UTC m=+0.056664942 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 12 13:42:05 compute-0 podman[209334]: 2026-01-12 13:42:05.543341085 +0000 UTC m=+0.039353090 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 12 13:42:06 compute-0 podman[209353]: 2026-01-12 13:42:06.539362444 +0000 UTC m=+0.034949378 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:42:12 compute-0 podman[209375]: 2026-01-12 13:42:12.536405078 +0000 UTC m=+0.032431607 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 12 13:42:13 compute-0 nova_compute[181978]: 2026-01-12 13:42:13.764 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:13 compute-0 nova_compute[181978]: 2026-01-12 13:42:13.824 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:13 compute-0 nova_compute[181978]: 2026-01-12 13:42:13.824 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:13 compute-0 nova_compute[181978]: 2026-01-12 13:42:13.825 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:14 compute-0 nova_compute[181978]: 2026-01-12 13:42:14.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:14 compute-0 nova_compute[181978]: 2026-01-12 13:42:14.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:42:14 compute-0 nova_compute[181978]: 2026-01-12 13:42:14.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:42:14 compute-0 nova_compute[181978]: 2026-01-12 13:42:14.512 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:42:14 compute-0 nova_compute[181978]: 2026-01-12 13:42:14.513 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:14 compute-0 nova_compute[181978]: 2026-01-12 13:42:14.514 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.481 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.507 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.507 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.507 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.508 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.700 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.700 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5931MB free_disk=73.4188461303711GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": 
"0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.701 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.701 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.744 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.744 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.761 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.769 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.770 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:42:15 compute-0 nova_compute[181978]: 2026-01-12 13:42:15.770 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:42:21 compute-0 podman[209391]: 2026-01-12 13:42:21.539485837 +0000 UTC m=+0.034347403 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:42:25 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:42:25.189 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:42:25 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:42:25.190 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:42:25 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:42:25.191 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:42:25 compute-0 podman[209412]: 2026-01-12 13:42:25.540661187 +0000 UTC m=+0.033783583 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 12 13:42:34 compute-0 podman[209430]: 2026-01-12 13:42:34.552476624 +0000 UTC m=+0.048905009 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 12 13:42:36 compute-0 podman[209453]: 2026-01-12 13:42:36.53676492 +0000 UTC m=+0.032095441 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Jan 12 13:42:37 compute-0 podman[209471]: 2026-01-12 13:42:37.540564842 +0000 UTC m=+0.031636165 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:42:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:42:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:42:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:42:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:42:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:42:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:42:43 compute-0 podman[209492]: 2026-01-12 13:42:43.540357466 +0000 UTC m=+0.036673926 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:42:52 compute-0 podman[209508]: 2026-01-12 13:42:52.539512702 +0000 UTC m=+0.035308849 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:42:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:42:56 compute-0 podman[209530]: 2026-01-12 13:42:56.539344283 +0000 UTC m=+0.034572617 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 12 13:43:05 compute-0 podman[209547]: 2026-01-12 13:43:05.553489292 +0000 UTC m=+0.049562467 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 12 13:43:07 compute-0 podman[209571]: 2026-01-12 13:43:07.539303477 +0000 UTC m=+0.035564571 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal)
Jan 12 13:43:08 compute-0 podman[209589]: 2026-01-12 13:43:08.536373816 +0000 UTC m=+0.032642246 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 12 13:43:13 compute-0 nova_compute[181978]: 2026-01-12 13:43:13.770 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:14 compute-0 nova_compute[181978]: 2026-01-12 13:43:14.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:14 compute-0 nova_compute[181978]: 2026-01-12 13:43:14.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:43:14 compute-0 nova_compute[181978]: 2026-01-12 13:43:14.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:43:14 compute-0 nova_compute[181978]: 2026-01-12 13:43:14.493 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:43:14 compute-0 podman[209610]: 2026-01-12 13:43:14.541348052 +0000 UTC m=+0.034677062 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.503 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.503 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.671 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.672 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6042MB free_disk=73.4190673828125GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": 
"0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.672 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.672 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.909 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.909 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.934 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.944 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.945 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:43:15 compute-0 nova_compute[181978]: 2026-01-12 13:43:15.945 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:43:16 compute-0 nova_compute[181978]: 2026-01-12 13:43:16.945 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:16 compute-0 nova_compute[181978]: 2026-01-12 13:43:16.946 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:43:17 compute-0 nova_compute[181978]: 2026-01-12 13:43:17.476 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:43:23 compute-0 podman[209626]: 2026-01-12 13:43:23.535349087 +0000 UTC m=+0.030267804 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:43:27 compute-0 podman[209648]: 2026-01-12 13:43:27.559063429 +0000 UTC m=+0.047550917 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 12 13:43:36 compute-0 podman[209665]: 2026-01-12 13:43:36.558651846 +0000 UTC m=+0.054254685 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 12 13:43:38 compute-0 podman[209688]: 2026-01-12 13:43:38.546786258 +0000 UTC m=+0.038284131 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Jan 12 13:43:38 compute-0 podman[209706]: 2026-01-12 13:43:38.598398262 +0000 UTC m=+0.033989904 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 12 13:43:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:43:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:43:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:43:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:43:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:43:40.195 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:43:45 compute-0 podman[209727]: 2026-01-12 13:43:45.541563991 +0000 UTC m=+0.036534620 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 12 13:43:52 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:43:52.328 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:43:52 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:43:52.329 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:43:54 compute-0 podman[209744]: 2026-01-12 13:43:54.542360181 +0000 UTC m=+0.037387644 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 12 13:43:56 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:43:56.330 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:43:58 compute-0 podman[209765]: 2026-01-12 13:43:58.549359969 +0000 UTC m=+0.040713769 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:44:07 compute-0 podman[209783]: 2026-01-12 13:44:07.560131388 +0000 UTC m=+0.052416758 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.094 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.095 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.114 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.211 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.211 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.216 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.216 181991 INFO nova.compute.claims [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.300 181991 DEBUG nova.compute.provider_tree [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.319 181991 DEBUG nova.scheduler.client.report [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.334 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.334 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.376 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.377 181991 DEBUG nova.network.neutron [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.394 181991 INFO nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.407 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.460 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.461 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.461 181991 INFO nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Creating image(s)
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.462 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.462 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.463 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.463 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:09 compute-0 nova_compute[181978]: 2026-01-12 13:44:09.463 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:09 compute-0 podman[209806]: 2026-01-12 13:44:09.538180023 +0000 UTC m=+0.036369449 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 12 13:44:09 compute-0 podman[209807]: 2026-01-12 13:44:09.541483335 +0000 UTC m=+0.037997741 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.157 181991 WARNING oslo_policy.policy [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.158 181991 WARNING oslo_policy.policy [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.160 181991 DEBUG nova.policy [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.706 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.749 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.part --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.750 181991 DEBUG nova.virt.images [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] bcf708d4-c9eb-4a4c-9503-f846d9f4a560 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.751 181991 DEBUG nova.privsep.utils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.751 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.part /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.802 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.part /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.converted" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.805 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.848 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07.converted --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.849 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.859 181991 INFO oslo.privsep.daemon [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpgsiv8d2i/privsep.sock']
Jan 12 13:44:10 compute-0 nova_compute[181978]: 2026-01-12 13:44:10.877 181991 DEBUG nova.network.neutron [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Successfully created port: a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.375 181991 INFO oslo.privsep.daemon [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Spawned new privsep daemon via rootwrap
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.300 209863 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.303 209863 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.305 209863 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.305 209863 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209863
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.435 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.488 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.489 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.490 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.498 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.540 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.541 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.558 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.558 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.559 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.600 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.601 181991 DEBUG nova.virt.disk.api [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.601 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.644 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.645 181991 DEBUG nova.virt.disk.api [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.645 181991 DEBUG nova.objects.instance [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 9828d316-7b89-422e-a561-fad4ab8d9a5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.665 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.665 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Ensure instance console log exists: /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.666 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.666 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.666 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.762 181991 DEBUG nova.network.neutron [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Successfully updated port: a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.772 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.772 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.772 181991 DEBUG nova.network.neutron [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:44:11 compute-0 nova_compute[181978]: 2026-01-12 13:44:11.915 181991 DEBUG nova.network.neutron [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.199 181991 DEBUG nova.compute.manager [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-changed-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.199 181991 DEBUG nova.compute.manager [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Refreshing instance network info cache due to event network-changed-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.199 181991 DEBUG oslo_concurrency.lockutils [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.389 181991 DEBUG nova.network.neutron [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Updating instance_info_cache with network_info: [{"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.406 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.407 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Instance network_info: |[{"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.407 181991 DEBUG oslo_concurrency.lockutils [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.407 181991 DEBUG nova.network.neutron [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Refreshing network info cache for port a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.409 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Start _get_guest_xml network_info=[{"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.412 181991 WARNING nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.418 181991 DEBUG nova.virt.libvirt.host [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.418 181991 DEBUG nova.virt.libvirt.host [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.421 181991 DEBUG nova.virt.libvirt.host [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.421 181991 DEBUG nova.virt.libvirt.host [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.421 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.422 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.422 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.422 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.423 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.423 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.423 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.423 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.423 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.424 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.424 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.424 181991 DEBUG nova.virt.hardware [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.427 181991 DEBUG nova.privsep.utils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.428 181991 DEBUG nova.virt.libvirt.vif [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:44:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-199675950',display_name='tempest-TestNetworkBasicOps-server-199675950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-199675950',id=1,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBITi0YMEMSb5EvQ/nE1lRL+KinozAtP7g7HW8TDnURVyfCEr6LGxgSnlcrD0JrfvV3bguWTEDFEDbbrirZoF9elh+X76j9Nfs8oPdJ+pu8HpUc5wGK0VIDXn7GVcMT8a9Q==',key_name='tempest-TestNetworkBasicOps-1498408073',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-tx90ib4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:44:09Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=9828d316-7b89-422e-a561-fad4ab8d9a5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.428 181991 DEBUG nova.network.os_vif_util [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.429 181991 DEBUG nova.network.os_vif_util [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:95:cb,bridge_name='br-int',has_traffic_filtering=True,id=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c,network=Network(e17082c0-6f0c-461a-a787-3e29d33c0965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2325a00-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.430 181991 DEBUG nova.objects.instance [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 9828d316-7b89-422e-a561-fad4ab8d9a5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.446 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <uuid>9828d316-7b89-422e-a561-fad4ab8d9a5a</uuid>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <name>instance-00000001</name>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-199675950</nova:name>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:44:12</nova:creationTime>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         <nova:port uuid="a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c">
Jan 12 13:44:12 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <system>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <entry name="serial">9828d316-7b89-422e-a561-fad4ab8d9a5a</entry>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <entry name="uuid">9828d316-7b89-422e-a561-fad4ab8d9a5a</entry>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </system>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <os>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   </os>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <features>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   </features>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.config"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:18:95:cb"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <target dev="tapa2325a00-e1"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/console.log" append="off"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <video>
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </video>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:44:12 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:44:12 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:44:12 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:44:12 compute-0 nova_compute[181978]: </domain>
Jan 12 13:44:12 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.447 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Preparing to wait for external event network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.447 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.447 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.447 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.448 181991 DEBUG nova.virt.libvirt.vif [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:44:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-199675950',display_name='tempest-TestNetworkBasicOps-server-199675950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-199675950',id=1,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBITi0YMEMSb5EvQ/nE1lRL+KinozAtP7g7HW8TDnURVyfCEr6LGxgSnlcrD0JrfvV3bguWTEDFEDbbrirZoF9elh+X76j9Nfs8oPdJ+pu8HpUc5wGK0VIDXn7GVcMT8a9Q==',key_name='tempest-TestNetworkBasicOps-1498408073',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-tx90ib4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:44:09Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=9828d316-7b89-422e-a561-fad4ab8d9a5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.448 181991 DEBUG nova.network.os_vif_util [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.448 181991 DEBUG nova.network.os_vif_util [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:95:cb,bridge_name='br-int',has_traffic_filtering=True,id=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c,network=Network(e17082c0-6f0c-461a-a787-3e29d33c0965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2325a00-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.449 181991 DEBUG os_vif [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:95:cb,bridge_name='br-int',has_traffic_filtering=True,id=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c,network=Network(e17082c0-6f0c-461a-a787-3e29d33c0965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2325a00-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.474 181991 DEBUG ovsdbapp.backend.ovs_idl [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.475 181991 DEBUG ovsdbapp.backend.ovs_idl [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.475 181991 DEBUG ovsdbapp.backend.ovs_idl [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.475 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.476 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.476 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.476 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.477 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.478 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.485 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.485 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.485 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:44:12 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.486 181991 INFO oslo.privsep.daemon [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp3dj8tz3p/privsep.sock']
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.002 181991 INFO oslo.privsep.daemon [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Spawned new privsep daemon via rootwrap
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.925 209884 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.928 209884 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.929 209884 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:12.930 209884 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209884
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.249 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.249 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2325a00-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.250 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2325a00-e1, col_values=(('external_ids', {'iface-id': 'a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:95:cb', 'vm-uuid': '9828d316-7b89-422e-a561-fad4ab8d9a5a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:13 compute-0 NetworkManager[55211]: <info>  [1768225453.2517] manager: (tapa2325a00-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.251 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.254 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.257 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.257 181991 INFO os_vif [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:95:cb,bridge_name='br-int',has_traffic_filtering=True,id=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c,network=Network(e17082c0-6f0c-461a-a787-3e29d33c0965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2325a00-e1')
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.324 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.325 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.325 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:18:95:cb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.325 181991 INFO nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Using config drive
Jan 12 13:44:13 compute-0 nova_compute[181978]: 2026-01-12 13:44:13.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:14 compute-0 nova_compute[181978]: 2026-01-12 13:44:14.450 181991 INFO nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Creating config drive at /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.config
Jan 12 13:44:14 compute-0 nova_compute[181978]: 2026-01-12 13:44:14.454 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_cc_tpt4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:14 compute-0 nova_compute[181978]: 2026-01-12 13:44:14.571 181991 DEBUG oslo_concurrency.processutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_cc_tpt4" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:14 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 12 13:44:14 compute-0 kernel: tapa2325a00-e1: entered promiscuous mode
Jan 12 13:44:14 compute-0 NetworkManager[55211]: <info>  [1768225454.6170] manager: (tapa2325a00-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Jan 12 13:44:14 compute-0 nova_compute[181978]: 2026-01-12 13:44:14.617 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:14 compute-0 ovn_controller[94974]: 2026-01-12T13:44:14Z|00027|binding|INFO|Claiming lport a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c for this chassis.
Jan 12 13:44:14 compute-0 ovn_controller[94974]: 2026-01-12T13:44:14Z|00028|binding|INFO|a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c: Claiming fa:16:3e:18:95:cb 10.100.0.12
Jan 12 13:44:14 compute-0 nova_compute[181978]: 2026-01-12 13:44:14.620 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:14 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:14.628 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:95:cb 10.100.0.12'], port_security=['fa:16:3e:18:95:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e17082c0-6f0c-461a-a787-3e29d33c0965', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6348b0d4-7e0c-43c9-b18b-1192deea413c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f553e1bd-6f22-4cf5-81ec-96063b5f8304, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:44:14 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:14.629 104189 INFO neutron.agent.ovn.metadata.agent [-] Port a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c in datapath e17082c0-6f0c-461a-a787-3e29d33c0965 bound to our chassis
Jan 12 13:44:14 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:14.630 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e17082c0-6f0c-461a-a787-3e29d33c0965
Jan 12 13:44:14 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:14.631 104189 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmprni5_e4t/privsep.sock']
Jan 12 13:44:14 compute-0 systemd-udevd[209911]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:44:14 compute-0 NetworkManager[55211]: <info>  [1768225454.6486] device (tapa2325a00-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:44:14 compute-0 NetworkManager[55211]: <info>  [1768225454.6492] device (tapa2325a00-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:44:14 compute-0 systemd-machined[153581]: New machine qemu-1-instance-00000001.
Jan 12 13:44:14 compute-0 nova_compute[181978]: 2026-01-12 13:44:14.698 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:14 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 12 13:44:14 compute-0 ovn_controller[94974]: 2026-01-12T13:44:14Z|00029|binding|INFO|Setting lport a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c ovn-installed in OVS
Jan 12 13:44:14 compute-0 ovn_controller[94974]: 2026-01-12T13:44:14Z|00030|binding|INFO|Setting lport a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c up in Southbound
Jan 12 13:44:14 compute-0 nova_compute[181978]: 2026-01-12 13:44:14.705 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.143 104189 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.144 104189 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprni5_e4t/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.072 209930 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.075 209930 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.076 209930 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.077 209930 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209930
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.146 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[45d671d3-945f-45ee-bcbf-55aca91d4276]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.380 181991 DEBUG nova.network.neutron [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Updated VIF entry in instance network info cache for port a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.380 181991 DEBUG nova.network.neutron [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Updating instance_info_cache with network_info: [{"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.395 181991 DEBUG oslo_concurrency.lockutils [req-399e4237-8922-4dfa-8a4a-c9c3bc4c7641 req-1f84e129-f4c2-4ec2-993d-1c0c2c1d817a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.501 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.502 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.502 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225455.5011654, 9828d316-7b89-422e-a561-fad4ab8d9a5a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.502 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] VM Started (Lifecycle Event)
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.503 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.538 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.540 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225455.5013025, 9828d316-7b89-422e-a561-fad4ab8d9a5a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.541 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] VM Paused (Lifecycle Event)
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.554 181991 DEBUG nova.compute.manager [req-34b82175-bbd1-4958-935f-cc64577e0ad7 req-f3931a29-4942-4151-9780-264c8c9685a0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.554 181991 DEBUG oslo_concurrency.lockutils [req-34b82175-bbd1-4958-935f-cc64577e0ad7 req-f3931a29-4942-4151-9780-264c8c9685a0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.555 181991 DEBUG oslo_concurrency.lockutils [req-34b82175-bbd1-4958-935f-cc64577e0ad7 req-f3931a29-4942-4151-9780-264c8c9685a0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.555 181991 DEBUG oslo_concurrency.lockutils [req-34b82175-bbd1-4958-935f-cc64577e0ad7 req-f3931a29-4942-4151-9780-264c8c9685a0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.555 181991 DEBUG nova.compute.manager [req-34b82175-bbd1-4958-935f-cc64577e0ad7 req-f3931a29-4942-4151-9780-264c8c9685a0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Processing event network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.556 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.556 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.561 209930 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.564 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.561 209930 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:15.561 209930 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.565 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225455.5639448, 9828d316-7b89-422e-a561-fad4ab8d9a5a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.565 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] VM Resumed (Lifecycle Event)
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.567 181991 INFO nova.virt.libvirt.driver [-] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Instance spawned successfully.
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.567 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.582 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.584 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.611 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.635 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.635 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.635 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.636 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.636 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.637 181991 DEBUG nova.virt.libvirt.driver [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.684 181991 INFO nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Took 6.22 seconds to spawn the instance on the hypervisor.
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.688 181991 DEBUG nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.748 181991 INFO nova.compute.manager [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Took 6.56 seconds to build instance.
Jan 12 13:44:15 compute-0 nova_compute[181978]: 2026-01-12 13:44:15.762 181991 DEBUG oslo_concurrency.lockutils [None req-271f0c16-bb78-4676-93a9-08f53007d68c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.027 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[adf00598-316a-497a-86a7-9aec106fcdb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.028 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape17082c0-61 in ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.029 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape17082c0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.029 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[dac23dab-89f5-4f2b-9334-6cd37c6ee9d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.032 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[0acace81-1dc8-41e5-bf3b-7076ed9562c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.052 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[068ecbef-9662-45d7-9471-d597884aaa5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.075 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd3d5ff-ef54-4137-aca3-a5890df1c56c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.077 104189 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpj_xlfiri/privsep.sock']
Jan 12 13:44:16 compute-0 podman[209946]: 2026-01-12 13:44:16.132653848 +0000 UTC m=+0.065498771 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.481 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.481 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.502 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.560 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.614 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.615 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.659 104189 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.660 104189 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpj_xlfiri/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.566 209970 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.569 209970 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.571 209970 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.571 209970 INFO oslo.privsep.daemon [-] privsep daemon running as pid 209970
Jan 12 13:44:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:16.662 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[15d29142-2aeb-4106-8eb8-82c8d30a8dbf]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.667 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.904 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.907 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5704MB free_disk=73.38414764404297GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": 
"0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.907 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.908 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.973 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Instance 9828d316-7b89-422e-a561-fad4ab8d9a5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.974 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:44:16 compute-0 nova_compute[181978]: 2026-01-12 13:44:16.974 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.015 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.040 181991 ERROR nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [req-3798210a-c242-4713-bdb4-250e878ec2ea] Failed to update inventory to [{'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 5f3fe3a8-f640-4221-8f9a-71aa07eebe17.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-3798210a-c242-4713-bdb4-250e878ec2ea"}]}
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.055 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Refreshing inventories for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.069 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating ProviderTree inventory for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.070 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.088 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Refreshing aggregate associations for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.105 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Refreshing trait associations for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_AVX512VAES,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.129 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.142 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.144 209970 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.144 209970 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.144 209970 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.187 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updated inventory for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.187 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.188 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.209 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.210 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.622 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[16385a2e-0fc9-4ce8-8fb9-dd01cdb3e716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.627 181991 DEBUG nova.compute.manager [req-c0394085-fc81-4af8-8419-2583d4aa6628 req-003f6cb2-45b8-479c-8545-9e38b9c8cfe4 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.628 181991 DEBUG oslo_concurrency.lockutils [req-c0394085-fc81-4af8-8419-2583d4aa6628 req-003f6cb2-45b8-479c-8545-9e38b9c8cfe4 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.628 181991 DEBUG oslo_concurrency.lockutils [req-c0394085-fc81-4af8-8419-2583d4aa6628 req-003f6cb2-45b8-479c-8545-9e38b9c8cfe4 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.628 181991 DEBUG oslo_concurrency.lockutils [req-c0394085-fc81-4af8-8419-2583d4aa6628 req-003f6cb2-45b8-479c-8545-9e38b9c8cfe4 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.628 181991 DEBUG nova.compute.manager [req-c0394085-fc81-4af8-8419-2583d4aa6628 req-003f6cb2-45b8-479c-8545-9e38b9c8cfe4 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] No waiting events found dispatching network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.629 181991 WARNING nova.compute.manager [req-c0394085-fc81-4af8-8419-2583d4aa6628 req-003f6cb2-45b8-479c-8545-9e38b9c8cfe4 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received unexpected event network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c for instance with vm_state active and task_state None.
Jan 12 13:44:17 compute-0 NetworkManager[55211]: <info>  [1768225457.6519] manager: (tape17082c0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.653 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[50ad1971-b35d-4db0-91fe-e6050ce888db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 systemd-udevd[209987]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.676 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[03228079-27a8-4218-b10d-3fee8b5222c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.679 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4e3034-71e9-454e-a9ff-050d9e5e656f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 NetworkManager[55211]: <info>  [1768225457.7025] device (tape17082c0-60): carrier: link connected
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.709 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0162a0-0006-4af7-b779-3f4ad723cc2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.725 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[24993a62-66a7-4127-8a12-7bec8238df8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape17082c0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:3e:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 252372, 'reachable_time': 34916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209997, 'error': None, 'target': 'ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.737 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[16a25c87-5880-4db3-90fb-b49f6462f7d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:3eed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 252372, 'tstamp': 252372}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209998, 'error': None, 'target': 'ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.750 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[907fb971-1ac6-40e3-a8c3-9219dd6cc8e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape17082c0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:3e:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 252372, 'reachable_time': 34916, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210000, 'error': None, 'target': 'ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.770 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[30f45d04-e29e-41f5-ab1d-626fa1c5d1e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.815 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[0f4a3dd0-aa84-4c23-9bdd-c9f6029f45c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.816 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape17082c0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.817 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.817 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape17082c0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:17 compute-0 kernel: tape17082c0-60: entered promiscuous mode
Jan 12 13:44:17 compute-0 NetworkManager[55211]: <info>  [1768225457.8193] manager: (tape17082c0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.822 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape17082c0-60, col_values=(('external_ids', {'iface-id': 'a3bb26f6-67f0-4092-965f-456ffbc02ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.823 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:17 compute-0 ovn_controller[94974]: 2026-01-12T13:44:17Z|00031|binding|INFO|Releasing lport a3bb26f6-67f0-4092-965f-456ffbc02ff3 from this chassis (sb_readonly=0)
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.827 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e17082c0-6f0c-461a-a787-3e29d33c0965.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e17082c0-6f0c-461a-a787-3e29d33c0965.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.828 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[9d04a781-8d58-45ae-ac06-b8979c940efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.829 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-e17082c0-6f0c-461a-a787-3e29d33c0965
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/e17082c0-6f0c-461a-a787-3e29d33c0965.pid.haproxy
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID e17082c0-6f0c-461a-a787-3e29d33c0965
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:44:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:17.830 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965', 'env', 'PROCESS_TAG=haproxy-e17082c0-6f0c-461a-a787-3e29d33c0965', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e17082c0-6f0c-461a-a787-3e29d33c0965.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:44:17 compute-0 nova_compute[181978]: 2026-01-12 13:44:17.835 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:18 compute-0 podman[210029]: 2026-01-12 13:44:18.116828834 +0000 UTC m=+0.039527418 container create 1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:44:18 compute-0 systemd[1]: Started libpod-conmon-1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5.scope.
Jan 12 13:44:18 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:44:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c41b9f9904f5f6e6a2e63fac0cd7b1eac772e579467896fc8f33f5b15ffea0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:44:18 compute-0 podman[210029]: 2026-01-12 13:44:18.189269739 +0000 UTC m=+0.111968343 container init 1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 12 13:44:18 compute-0 podman[210029]: 2026-01-12 13:44:18.194181206 +0000 UTC m=+0.116879791 container start 1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 12 13:44:18 compute-0 podman[210029]: 2026-01-12 13:44:18.098988392 +0000 UTC m=+0.021686997 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.206 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:18 compute-0 neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965[210041]: [NOTICE]   (210045) : New worker (210047) forked
Jan 12 13:44:18 compute-0 neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965[210041]: [NOTICE]   (210045) : Loading success.
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.221 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.251 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:44:18 compute-0 ovn_controller[94974]: 2026-01-12T13:44:18Z|00032|binding|INFO|Releasing lport a3bb26f6-67f0-4092-965f-456ffbc02ff3 from this chassis (sb_readonly=0)
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5107] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5113] device (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <warn>  [1768225458.5114] device (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5122] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5125] device (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <warn>  [1768225458.5125] device (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5131] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5141] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5145] device (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 12 13:44:18 compute-0 NetworkManager[55211]: <info>  [1768225458.5149] device (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.510 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:18 compute-0 ovn_controller[94974]: 2026-01-12T13:44:18Z|00033|binding|INFO|Releasing lport a3bb26f6-67f0-4092-965f-456ffbc02ff3 from this chassis (sb_readonly=0)
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.541 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:18 compute-0 nova_compute[181978]: 2026-01-12 13:44:18.544 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:19 compute-0 nova_compute[181978]: 2026-01-12 13:44:19.684 181991 DEBUG nova.compute.manager [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-changed-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:44:19 compute-0 nova_compute[181978]: 2026-01-12 13:44:19.684 181991 DEBUG nova.compute.manager [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Refreshing instance network info cache due to event network-changed-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:44:19 compute-0 nova_compute[181978]: 2026-01-12 13:44:19.684 181991 DEBUG oslo_concurrency.lockutils [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:44:19 compute-0 nova_compute[181978]: 2026-01-12 13:44:19.684 181991 DEBUG oslo_concurrency.lockutils [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:44:19 compute-0 nova_compute[181978]: 2026-01-12 13:44:19.685 181991 DEBUG nova.network.neutron [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Refreshing network info cache for port a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:44:20 compute-0 nova_compute[181978]: 2026-01-12 13:44:20.782 181991 DEBUG nova.network.neutron [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Updated VIF entry in instance network info cache for port a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:44:20 compute-0 nova_compute[181978]: 2026-01-12 13:44:20.783 181991 DEBUG nova.network.neutron [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Updating instance_info_cache with network_info: [{"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:44:20 compute-0 nova_compute[181978]: 2026-01-12 13:44:20.797 181991 DEBUG oslo_concurrency.lockutils [req-132903b5-65a6-417b-8068-0ff1b1c71d35 req-a166b193-5259-4e33-9a43-35c8b1cddc6f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:44:22 compute-0 nova_compute[181978]: 2026-01-12 13:44:22.130 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:23 compute-0 nova_compute[181978]: 2026-01-12 13:44:23.252 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:25 compute-0 podman[210060]: 2026-01-12 13:44:25.539560735 +0000 UTC m=+0.036112095 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 12 13:44:26 compute-0 ovn_controller[94974]: 2026-01-12T13:44:26Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:18:95:cb 10.100.0.12
Jan 12 13:44:26 compute-0 ovn_controller[94974]: 2026-01-12T13:44:26Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:18:95:cb 10.100.0.12
Jan 12 13:44:27 compute-0 nova_compute[181978]: 2026-01-12 13:44:27.132 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:28 compute-0 nova_compute[181978]: 2026-01-12 13:44:28.255 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:29 compute-0 podman[210088]: 2026-01-12 13:44:29.541952511 +0000 UTC m=+0.036630892 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 12 13:44:32 compute-0 nova_compute[181978]: 2026-01-12 13:44:32.135 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:33 compute-0 nova_compute[181978]: 2026-01-12 13:44:33.257 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:33 compute-0 nova_compute[181978]: 2026-01-12 13:44:33.429 181991 INFO nova.compute.manager [None req-b1b4a66e-03ab-4a8d-b8da-8982bdcba07f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Get console output
Jan 12 13:44:33 compute-0 nova_compute[181978]: 2026-01-12 13:44:33.498 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:44:37 compute-0 nova_compute[181978]: 2026-01-12 13:44:37.135 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:38 compute-0 nova_compute[181978]: 2026-01-12 13:44:38.258 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:38 compute-0 podman[210106]: 2026-01-12 13:44:38.58756365 +0000 UTC m=+0.080667783 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:44:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:40.197 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:40.198 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:40.198 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:40 compute-0 podman[210129]: 2026-01-12 13:44:40.536013696 +0000 UTC m=+0.032486574 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:44:40 compute-0 podman[210130]: 2026-01-12 13:44:40.5493899 +0000 UTC m=+0.044210711 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.136 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.436 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.436 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.450 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.518 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.519 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.524 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.524 181991 INFO nova.compute.claims [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.620 181991 DEBUG nova.compute.provider_tree [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.631 181991 DEBUG nova.scheduler.client.report [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.643 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.644 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.686 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.686 181991 DEBUG nova.network.neutron [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.701 181991 INFO nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.712 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.773 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.774 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.774 181991 INFO nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Creating image(s)
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.775 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.775 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.775 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.785 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.826 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.827 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.828 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.837 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.870 181991 DEBUG nova.policy [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.879 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.880 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.898 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.899 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.900 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.942 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.943 181991 DEBUG nova.virt.disk.api [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.943 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.984 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.985 181991 DEBUG nova.virt.disk.api [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.985 181991 DEBUG nova.objects.instance [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid a75f5a44-2ce8-42d7-97fb-0a18198794ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.995 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.995 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Ensure instance console log exists: /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.996 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.996 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:42 compute-0 nova_compute[181978]: 2026-01-12 13:44:42.996 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:43 compute-0 nova_compute[181978]: 2026-01-12 13:44:43.260 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:43 compute-0 nova_compute[181978]: 2026-01-12 13:44:43.478 181991 DEBUG nova.network.neutron [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Successfully created port: 046c8a5e-9569-4f66-8722-ff77b7dbf6af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:44:44 compute-0 nova_compute[181978]: 2026-01-12 13:44:44.070 181991 DEBUG nova.network.neutron [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Successfully updated port: 046c8a5e-9569-4f66-8722-ff77b7dbf6af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:44:44 compute-0 nova_compute[181978]: 2026-01-12 13:44:44.083 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-a75f5a44-2ce8-42d7-97fb-0a18198794ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:44:44 compute-0 nova_compute[181978]: 2026-01-12 13:44:44.083 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-a75f5a44-2ce8-42d7-97fb-0a18198794ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:44:44 compute-0 nova_compute[181978]: 2026-01-12 13:44:44.083 181991 DEBUG nova.network.neutron [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:44:44 compute-0 nova_compute[181978]: 2026-01-12 13:44:44.426 181991 DEBUG nova.compute.manager [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-changed-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:44:44 compute-0 nova_compute[181978]: 2026-01-12 13:44:44.427 181991 DEBUG nova.compute.manager [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Refreshing instance network info cache due to event network-changed-046c8a5e-9569-4f66-8722-ff77b7dbf6af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:44:44 compute-0 nova_compute[181978]: 2026-01-12 13:44:44.427 181991 DEBUG oslo_concurrency.lockutils [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-a75f5a44-2ce8-42d7-97fb-0a18198794ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:44:45 compute-0 nova_compute[181978]: 2026-01-12 13:44:45.176 181991 DEBUG nova.network.neutron [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:44:46 compute-0 podman[210183]: 2026-01-12 13:44:46.542712967 +0000 UTC m=+0.038084195 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 12 13:44:47 compute-0 nova_compute[181978]: 2026-01-12 13:44:47.139 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.179 181991 DEBUG nova.network.neutron [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Updating instance_info_cache with network_info: [{"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.192 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-a75f5a44-2ce8-42d7-97fb-0a18198794ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.192 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Instance network_info: |[{"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.193 181991 DEBUG oslo_concurrency.lockutils [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-a75f5a44-2ce8-42d7-97fb-0a18198794ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.193 181991 DEBUG nova.network.neutron [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Refreshing network info cache for port 046c8a5e-9569-4f66-8722-ff77b7dbf6af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.195 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Start _get_guest_xml network_info=[{"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.197 181991 WARNING nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.201 181991 DEBUG nova.virt.libvirt.host [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.202 181991 DEBUG nova.virt.libvirt.host [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.206 181991 DEBUG nova.virt.libvirt.host [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.207 181991 DEBUG nova.virt.libvirt.host [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.207 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.207 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.208 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.208 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.208 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.208 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.208 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.209 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.209 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.209 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.209 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.209 181991 DEBUG nova.virt.hardware [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.212 181991 DEBUG nova.virt.libvirt.vif [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:44:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-881283916',display_name='tempest-TestNetworkBasicOps-server-881283916',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-881283916',id=2,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8VXsWOgksxgSgZGOO55gaORCF3OlcQ+IPqaCCnCta/+H8teqO4aQ9Z/d8/GOEzwFb3jL331i/9/p3enwGNWusccmSiQgSzzYrYKfQ3maOnwB9S22f9FoR84rXMgBhu2Q==',key_name='tempest-TestNetworkBasicOps-1328257164',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-01t2hs0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:44:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a75f5a44-2ce8-42d7-97fb-0a18198794ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.212 181991 DEBUG nova.network.os_vif_util [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.213 181991 DEBUG nova.network.os_vif_util [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:ea:5a,bridge_name='br-int',has_traffic_filtering=True,id=046c8a5e-9569-4f66-8722-ff77b7dbf6af,network=Network(d644de00-cee9-4519-857b-9315d08f3a6b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046c8a5e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.213 181991 DEBUG nova.objects.instance [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid a75f5a44-2ce8-42d7-97fb-0a18198794ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.223 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <uuid>a75f5a44-2ce8-42d7-97fb-0a18198794ef</uuid>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <name>instance-00000002</name>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-881283916</nova:name>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:44:48</nova:creationTime>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         <nova:port uuid="046c8a5e-9569-4f66-8722-ff77b7dbf6af">
Jan 12 13:44:48 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <system>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <entry name="serial">a75f5a44-2ce8-42d7-97fb-0a18198794ef</entry>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <entry name="uuid">a75f5a44-2ce8-42d7-97fb-0a18198794ef</entry>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </system>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <os>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   </os>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <features>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   </features>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.config"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:fa:ea:5a"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <target dev="tap046c8a5e-95"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/console.log" append="off"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <video>
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </video>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:44:48 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:44:48 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:44:48 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:44:48 compute-0 nova_compute[181978]: </domain>
Jan 12 13:44:48 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.224 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Preparing to wait for external event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.224 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.224 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.224 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.225 181991 DEBUG nova.virt.libvirt.vif [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:44:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-881283916',display_name='tempest-TestNetworkBasicOps-server-881283916',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-881283916',id=2,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8VXsWOgksxgSgZGOO55gaORCF3OlcQ+IPqaCCnCta/+H8teqO4aQ9Z/d8/GOEzwFb3jL331i/9/p3enwGNWusccmSiQgSzzYrYKfQ3maOnwB9S22f9FoR84rXMgBhu2Q==',key_name='tempest-TestNetworkBasicOps-1328257164',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-01t2hs0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:44:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a75f5a44-2ce8-42d7-97fb-0a18198794ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.225 181991 DEBUG nova.network.os_vif_util [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.225 181991 DEBUG nova.network.os_vif_util [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:ea:5a,bridge_name='br-int',has_traffic_filtering=True,id=046c8a5e-9569-4f66-8722-ff77b7dbf6af,network=Network(d644de00-cee9-4519-857b-9315d08f3a6b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046c8a5e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.226 181991 DEBUG os_vif [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:ea:5a,bridge_name='br-int',has_traffic_filtering=True,id=046c8a5e-9569-4f66-8722-ff77b7dbf6af,network=Network(d644de00-cee9-4519-857b-9315d08f3a6b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046c8a5e-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.226 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.226 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.227 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.228 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.229 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap046c8a5e-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.229 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap046c8a5e-95, col_values=(('external_ids', {'iface-id': '046c8a5e-9569-4f66-8722-ff77b7dbf6af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:ea:5a', 'vm-uuid': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:48 compute-0 NetworkManager[55211]: <info>  [1768225488.2311] manager: (tap046c8a5e-95): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.230 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.233 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.235 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.236 181991 INFO os_vif [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:ea:5a,bridge_name='br-int',has_traffic_filtering=True,id=046c8a5e-9569-4f66-8722-ff77b7dbf6af,network=Network(d644de00-cee9-4519-857b-9315d08f3a6b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046c8a5e-95')
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.272 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.272 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.273 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:fa:ea:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.273 181991 INFO nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Using config drive
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.820 181991 INFO nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Creating config drive at /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.config
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.824 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjb8hd2ix execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.941 181991 DEBUG oslo_concurrency.processutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjb8hd2ix" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:44:48 compute-0 kernel: tap046c8a5e-95: entered promiscuous mode
Jan 12 13:44:48 compute-0 NetworkManager[55211]: <info>  [1768225488.9731] manager: (tap046c8a5e-95): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.974 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:48 compute-0 ovn_controller[94974]: 2026-01-12T13:44:48Z|00034|binding|INFO|Claiming lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af for this chassis.
Jan 12 13:44:48 compute-0 ovn_controller[94974]: 2026-01-12T13:44:48Z|00035|binding|INFO|046c8a5e-9569-4f66-8722-ff77b7dbf6af: Claiming fa:16:3e:fa:ea:5a 10.100.0.22
Jan 12 13:44:48 compute-0 nova_compute[181978]: 2026-01-12 13:44:48.978 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:48 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:48.987 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:ea:5a 10.100.0.22'], port_security=['fa:16:3e:fa:ea:5a 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d644de00-cee9-4519-857b-9315d08f3a6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99cc2e72-a1a5-4ad5-83b1-2112f5ce8e09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cbc8ad1-eb42-4a9d-8275-a3ca6b307017, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=046c8a5e-9569-4f66-8722-ff77b7dbf6af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:44:48 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:48.988 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 046c8a5e-9569-4f66-8722-ff77b7dbf6af in datapath d644de00-cee9-4519-857b-9315d08f3a6b bound to our chassis
Jan 12 13:44:48 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:48.989 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d644de00-cee9-4519-857b-9315d08f3a6b
Jan 12 13:44:48 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:48.997 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ebdc8fd2-7533-414c-b35d-511791e6625a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:48 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:48.998 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd644de00-c1 in ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:48.999 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd644de00-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:48.999 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c9b063-7ee5-4f6c-8e70-adea656acd27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.000 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[0277babd-9b4b-4c82-ac5c-b0fa33dfb451]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 systemd-udevd[210218]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:44:49 compute-0 NetworkManager[55211]: <info>  [1768225489.0132] device (tap046c8a5e-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:44:49 compute-0 NetworkManager[55211]: <info>  [1768225489.0136] device (tap046c8a5e-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:44:49 compute-0 ovn_controller[94974]: 2026-01-12T13:44:49Z|00036|binding|INFO|Setting lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af ovn-installed in OVS
Jan 12 13:44:49 compute-0 ovn_controller[94974]: 2026-01-12T13:44:49Z|00037|binding|INFO|Setting lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af up in Southbound
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.014 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.018 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.021 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[05760bbd-8693-4edf-8e2d-149230a7cbea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 systemd-machined[153581]: New machine qemu-2-instance-00000002.
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.032 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[eb506990-008d-4fa0-aa09-ae7a8a67a2e5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.050 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[f2107c91-e817-439a-9efd-35f48d2e24a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 NetworkManager[55211]: <info>  [1768225489.0539] manager: (tapd644de00-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.053 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[85f45cd3-2d2f-4e40-a5d1-25b37413498f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 systemd-udevd[210222]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.074 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[f746104c-e46a-4405-a6df-367f5c5dbbdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.076 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[f93a184d-4899-4712-b5fa-09207a286634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 NetworkManager[55211]: <info>  [1768225489.0918] device (tapd644de00-c0): carrier: link connected
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.096 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[060e1574-8dc6-4159-8720-fb8248140cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.108 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[11ffb1da-65fb-46bc-8b4d-172ed602a670]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd644de00-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:c4:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 255511, 'reachable_time': 26122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210244, 'error': None, 'target': 'ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.117 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef8ebc8-659e-4235-a93f-9b304f25a38d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7a:c4ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 255511, 'tstamp': 255511}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210245, 'error': None, 'target': 'ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.130 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[37afc6c7-fdbc-4e3a-b944-2d5e439577ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd644de00-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:c4:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 255511, 'reachable_time': 26122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210246, 'error': None, 'target': 'ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.149 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[577f4587-95d3-477d-86e6-21759635d474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.162 181991 DEBUG nova.compute.manager [req-0019a609-9911-4d1d-8fa0-51cb36a14f1b req-5840bc32-c2de-48ba-8c12-92e3788a8206 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.163 181991 DEBUG oslo_concurrency.lockutils [req-0019a609-9911-4d1d-8fa0-51cb36a14f1b req-5840bc32-c2de-48ba-8c12-92e3788a8206 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.163 181991 DEBUG oslo_concurrency.lockutils [req-0019a609-9911-4d1d-8fa0-51cb36a14f1b req-5840bc32-c2de-48ba-8c12-92e3788a8206 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.163 181991 DEBUG oslo_concurrency.lockutils [req-0019a609-9911-4d1d-8fa0-51cb36a14f1b req-5840bc32-c2de-48ba-8c12-92e3788a8206 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.164 181991 DEBUG nova.compute.manager [req-0019a609-9911-4d1d-8fa0-51cb36a14f1b req-5840bc32-c2de-48ba-8c12-92e3788a8206 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Processing event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.193 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2e0e0c-7093-4d4b-8aad-e1e829bd6a59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.194 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd644de00-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.195 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.195 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd644de00-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.196 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:49 compute-0 NetworkManager[55211]: <info>  [1768225489.1970] manager: (tapd644de00-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 12 13:44:49 compute-0 kernel: tapd644de00-c0: entered promiscuous mode
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.201 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd644de00-c0, col_values=(('external_ids', {'iface-id': '4c5a6bef-f379-46b2-94bd-5273309fb73e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.202 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:49 compute-0 ovn_controller[94974]: 2026-01-12T13:44:49Z|00038|binding|INFO|Releasing lport 4c5a6bef-f379-46b2-94bd-5273309fb73e from this chassis (sb_readonly=0)
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.205 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d644de00-cee9-4519-857b-9315d08f3a6b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d644de00-cee9-4519-857b-9315d08f3a6b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.214 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7cddc7-88b5-4cf9-a9d9-87276b32630c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.215 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-d644de00-cee9-4519-857b-9315d08f3a6b
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/d644de00-cee9-4519-857b-9315d08f3a6b.pid.haproxy
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID d644de00-cee9-4519-857b-9315d08f3a6b
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.217 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:49.218 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b', 'env', 'PROCESS_TAG=haproxy-d644de00-cee9-4519-857b-9315d08f3a6b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d644de00-cee9-4519-857b-9315d08f3a6b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.285 181991 DEBUG nova.network.neutron [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Updated VIF entry in instance network info cache for port 046c8a5e-9569-4f66-8722-ff77b7dbf6af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.286 181991 DEBUG nova.network.neutron [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Updating instance_info_cache with network_info: [{"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:44:49 compute-0 nova_compute[181978]: 2026-01-12 13:44:49.299 181991 DEBUG oslo_concurrency.lockutils [req-28abdcdb-3f27-442b-86fe-f39d9ec8f2c1 req-86fb9c50-beba-4b74-ad82-75cb2f118304 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-a75f5a44-2ce8-42d7-97fb-0a18198794ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:44:49 compute-0 podman[210274]: 2026-01-12 13:44:49.489083968 +0000 UTC m=+0.030713330 container create 22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 12 13:44:49 compute-0 systemd[1]: Started libpod-conmon-22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071.scope.
Jan 12 13:44:49 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:44:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54f9c26296d2ba2fe3a1b8e8f043dc6976830a67acd46f3ccb7c49e7af0cb4c2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:44:49 compute-0 podman[210274]: 2026-01-12 13:44:49.543334495 +0000 UTC m=+0.084963866 container init 22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 12 13:44:49 compute-0 podman[210274]: 2026-01-12 13:44:49.547572418 +0000 UTC m=+0.089201789 container start 22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:44:49 compute-0 podman[210274]: 2026-01-12 13:44:49.475941385 +0000 UTC m=+0.017570756 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:44:49 compute-0 neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b[210286]: [NOTICE]   (210290) : New worker (210292) forked
Jan 12 13:44:49 compute-0 neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b[210286]: [NOTICE]   (210290) : Loading success.
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.809 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225490.8089397, a75f5a44-2ce8-42d7-97fb-0a18198794ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.810 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] VM Started (Lifecycle Event)
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.812 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.814 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.816 181991 INFO nova.virt.libvirt.driver [-] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Instance spawned successfully.
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.816 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.827 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.831 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.833 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.834 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.834 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.835 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.835 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.835 181991 DEBUG nova.virt.libvirt.driver [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.855 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.856 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225490.8096905, a75f5a44-2ce8-42d7-97fb-0a18198794ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.856 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] VM Paused (Lifecycle Event)
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.882 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.885 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225490.8137662, a75f5a44-2ce8-42d7-97fb-0a18198794ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.885 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] VM Resumed (Lifecycle Event)
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.900 181991 INFO nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Took 8.13 seconds to spawn the instance on the hypervisor.
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.900 181991 DEBUG nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.901 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.905 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.927 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.947 181991 INFO nova.compute.manager [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Took 8.46 seconds to build instance.
Jan 12 13:44:50 compute-0 nova_compute[181978]: 2026-01-12 13:44:50.957 181991 DEBUG oslo_concurrency.lockutils [None req-f004c3c1-48aa-4624-ac46-fb9f5d172eed d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:51 compute-0 nova_compute[181978]: 2026-01-12 13:44:51.234 181991 DEBUG nova.compute.manager [req-2350ed50-c86b-4c4f-bfc7-97e2bade290b req-f591493f-a6ac-49e0-814e-26407568eb03 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:44:51 compute-0 nova_compute[181978]: 2026-01-12 13:44:51.236 181991 DEBUG oslo_concurrency.lockutils [req-2350ed50-c86b-4c4f-bfc7-97e2bade290b req-f591493f-a6ac-49e0-814e-26407568eb03 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:44:51 compute-0 nova_compute[181978]: 2026-01-12 13:44:51.236 181991 DEBUG oslo_concurrency.lockutils [req-2350ed50-c86b-4c4f-bfc7-97e2bade290b req-f591493f-a6ac-49e0-814e-26407568eb03 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:44:51 compute-0 nova_compute[181978]: 2026-01-12 13:44:51.236 181991 DEBUG oslo_concurrency.lockutils [req-2350ed50-c86b-4c4f-bfc7-97e2bade290b req-f591493f-a6ac-49e0-814e-26407568eb03 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:44:51 compute-0 nova_compute[181978]: 2026-01-12 13:44:51.236 181991 DEBUG nova.compute.manager [req-2350ed50-c86b-4c4f-bfc7-97e2bade290b req-f591493f-a6ac-49e0-814e-26407568eb03 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] No waiting events found dispatching network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:44:51 compute-0 nova_compute[181978]: 2026-01-12 13:44:51.237 181991 WARNING nova.compute.manager [req-2350ed50-c86b-4c4f-bfc7-97e2bade290b req-f591493f-a6ac-49e0-814e-26407568eb03 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received unexpected event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af for instance with vm_state active and task_state None.
Jan 12 13:44:52 compute-0 nova_compute[181978]: 2026-01-12 13:44:52.142 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:53 compute-0 nova_compute[181978]: 2026-01-12 13:44:53.231 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.442 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}25aea31de428ba43f8027bcd725cca8283e9d6355428984652ae76557961dc5b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.514 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Mon, 12 Jan 2026 13:44:55 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-caf84b46-a83b-4d73-b3d4-2f773890d3f9 x-openstack-request-id: req-caf84b46-a83b-4d73-b3d4-2f773890d3f9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.515 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "0bbd7717-2f21-486b-811b-14d24452f9a6", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/0bbd7717-2f21-486b-811b-14d24452f9a6"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/0bbd7717-2f21-486b-811b-14d24452f9a6"}]}, {"id": "268e90b0-b401-40b1-a726-56e7d47ec0eb", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/268e90b0-b401-40b1-a726-56e7d47ec0eb"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/268e90b0-b401-40b1-a726-56e7d47ec0eb"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.515 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-caf84b46-a83b-4d73-b3d4-2f773890d3f9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.516 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/0bbd7717-2f21-486b-811b-14d24452f9a6 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}25aea31de428ba43f8027bcd725cca8283e9d6355428984652ae76557961dc5b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.586 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Mon, 12 Jan 2026 13:44:55 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-03af25c0-ed24-4c56-9e3d-2a4b046fff74 x-openstack-request-id: req-03af25c0-ed24-4c56-9e3d-2a4b046fff74 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.586 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "0bbd7717-2f21-486b-811b-14d24452f9a6", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/0bbd7717-2f21-486b-811b-14d24452f9a6"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/0bbd7717-2f21-486b-811b-14d24452f9a6"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.586 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/0bbd7717-2f21-486b-811b-14d24452f9a6 used request id req-03af25c0-ed24-4c56-9e3d-2a4b046fff74 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.587 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'name': 'tempest-TestNetworkBasicOps-server-881283916', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c978298f864c4039b47e09202eaf780c', 'user_id': 'd4158a3958504a578730a6b3561138ce', 'hostId': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.589 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'name': 'tempest-TestNetworkBasicOps-server-199675950', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c978298f864c4039b47e09202eaf780c', 'user_id': 'd4158a3958504a578730a6b3561138ce', 'hostId': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.591 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a75f5a44-2ce8-42d7-97fb-0a18198794ef / tap046c8a5e-95 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.591 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.593 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9828d316-7b89-422e-a561-fad4ab8d9a5a / tapa2325a00-e1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.593 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fb57ef2-173f-4bd4-8e3d-ffe91943b706', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.589386', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e0fae70c-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': 'c3a680eb063c2b4d75a61704dee1413eb1ac4b7949d784ba22a814c8f47ce059'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.589386', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e0fb32b6-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': 'a8cecebe33ad950ca8080b0b9f9a9d7962a77a9e6b7e797875d3af6f8bbb193a'}]}, 'timestamp': '2026-01-12 13:44:55.593582', '_unique_id': '6e971a7d3a2647ffa7182732c755cc76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.597 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.599 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.616 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.616 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.636 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.637 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1183930d-98c8-4026-a90c-4dd5ae540754', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.599174', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e0fec1c4-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '7ec02af57df99b6390bc428086faed3663c3204d69b1f3601f72148965f882b3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.599174', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e0fecb24-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '456c3ce3bed6d9186ff5eefb7c8c5c431648e48db57f5681333cf6efa629c732'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.599174', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e101db16-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': '9800eecea3143a06763aaa97c23268106e3eb296be864fc50dbefe4c77ea2529'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.599174', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e101e566-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': '3c201874c04decc4677de0b19a11a77329595e494d084194ea105ab20a14ee40'}]}, 'timestamp': '2026-01-12 13:44:55.637440', '_unique_id': 'c62a2e761e2b4bae8ad22d7afc61844b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.638 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2be96477-6bc0-4c68-be25-d42703965481', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.639050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e1022d78-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': '6ada81d7943b5a515039178bd6176ed374ab8d5c9cab2007224983f8d96f5026'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.639050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e10235fc-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': '0b2ba736a037d80a22c5c10339ef8de8c11035b5de5dbf5929acfc3f0f885243'}]}, 'timestamp': '2026-01-12 13:44:55.639496', '_unique_id': '8f742f34d68e4b408ad1cbdf763f0298'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.639 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.640 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.640 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>]
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.641 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.641 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.641 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.write.bytes volume: 72978432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.641 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cef1727-d734-4c1a-ab33-6f99e20c3206', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.641087', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1027cc4-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '9184c34e3fe9fb620a9a7e8d3e0a96610d6afd0984307b918d548b3cf0d98058'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.641087', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10284c6-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '593230d85f1a025b1a74074e9118d3cf900b388499d87be20e8266bdcbda075a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72978432, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.641087', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1028cc8-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': '8fbd07d43a29bb45afff4ab58604639ec809f99fd69112e520b08af8c787b8d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.641087', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e102943e-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': 'd0b68e9db27db2ab5c853e4ebd07d2e61b9a9fb5f80615556109368dde4de1c2'}]}, 'timestamp': '2026-01-12 13:44:55.641920', '_unique_id': '7d22d17685f84461a45ba31c2c1dc564'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.642 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.643 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.643 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.643 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.read.requests volume: 1086 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.643 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd618210a-8b0a-4294-9805-5f13c60abada', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.643017', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e102c7ec-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': 'a417dcce2c4ce5c87817f4945be9825d28c3ff250d185983a75a931e1d5fb3c2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.643017', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e102cfda-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '0762a65b1933cb77b0b1c9fca9af607f24bc854c7de0594605009db19db2fd7f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1086, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.643017', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e102d7aa-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': '8552e583316c3a15044aefb81a81b5eca156b7c69241e7c740ca08a9a9624e7a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.643017', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e102df0c-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': 'faa3bd61deb60518def37e809c270ae4850b09776455a48997278c16b89f9f2d'}]}, 'timestamp': '2026-01-12 13:44:55.643812', '_unique_id': 'a356c866f0ad4373b93193d80f9d8bc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.644 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1496f33-1bb4-4159-b1d1-f1d1ee80891c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.644925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e10312a6-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': '632be7ad6b4f3f39c44668d3080b6a71d2051ab078a77288ca064c8586f71392'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.644925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e1031ada-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': 'd1aba1b3821b47176075dbe7db187ba56580c609dc34ef19ef3a786aa8d42591'}]}, 'timestamp': '2026-01-12 13:44:55.645350', '_unique_id': 'd76e1665a2af47ac92512c833e8dadd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.645 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.646 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.646 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.646 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.outgoing.packets volume: 139 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '067a5552-b454-448b-a58e-1b24864fb724', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.646395', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e1034ece-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': '8c3acdfbdde0520b92b56a913f60125756a077228a8723296cb0f9ed3307d480'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 139, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.646395', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e1035770-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': '14826a885b44f272f510036bc15fc4e571d779446725e5e8af92a2769a5ca9f8'}]}, 'timestamp': '2026-01-12 13:44:55.646921', '_unique_id': 'ff90cd634001420f84e1fc3c0d3b1c16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.647 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.648 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.648 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.read.latency volume: 124775561 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.648 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.read.latency volume: 3451644 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.648 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.read.latency volume: 210502781 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.648 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.read.latency volume: 83013321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '014729f8-3180-4159-8e0b-10d64dad7d76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 124775561, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.648131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e103906e-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': 'ed8b67c969a13d7d18ec011eefc79bab8b2f535d64286bb2a7176b7eda74b99c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3451644, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.648131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10398a2-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': 'd45fca0ecfa2d5a01e798781b50b2cabc452014d80dac645499d97bed2ea26aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 210502781, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.648131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e103a0d6-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': 'd4c638622f1679fb4629cd0217668fbe8865928c350afb2283fcf8e6b6718c74'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 83013321, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.648131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e103a8ce-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': 'de3e4e8d290bb35febe1d0468e6b964ddf096a5baf0c8c97f86181cdad12a13e'}]}, 'timestamp': '2026-01-12 13:44:55.648983', '_unique_id': 'cb9dc88171db4323a32c658b261e331a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.649 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.657 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.657 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.664 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.664 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd244b0c-d201-4e09-87f8-ad10cfd24913', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.650082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e104f8dc-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.717354527, 'message_signature': '902e9733b80b52c40711cc96a7c9bb6c8b831a3e51373acfdbaa03619c90cfe2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.650082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10501b0-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.717354527, 'message_signature': 'df354deb25cb4899be937f6630101103785c55e408a1c61cc02ca78f3c1abef5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.650082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10617ee-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.724959694, 'message_signature': 'fa7a9c928bf8ed818460a86793da5825fbc7b6b471b306f1786e31f69075fc6f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.650082', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10621c6-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.724959694, 'message_signature': '99fb121ed3397437bdc2d6e8916b4c0be51de8813c28fb7aeea185d4cb3e5861'}]}, 'timestamp': '2026-01-12 13:44:55.665185', '_unique_id': 'a781023cf2ad4ebb820f8528839fdc0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.665 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.666 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.666 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.666 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.666 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.read.bytes volume: 30202368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.666 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3ec7856-c981-4076-b088-6085641ae651', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.666341', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e106570e-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '46e3b692c9a8efba02d877dc53d6ebf9366488fce3309e60c1f8c2c50587d845'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.666341', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1065f6a-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': 'c1756739e43dea931c1ccc1bf448d28cfc1c9763b63a16028ceb65a0b4c46a7a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30202368, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.666341', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10667e4-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': '8195159608ec9f16385ecd63da8eebb149990fac4b69d1b732f8fc8228cc5ae2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.666341', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1066fdc-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': '75150fb5c497326194e8806d73235e8570d05dbc01fc84367af6824c6d8f2014'}]}, 'timestamp': '2026-01-12 13:44:55.667181', '_unique_id': '32e609841e854ade8b97390958a33b44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.667 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.668 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.668 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.668 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8195d4ef-9333-4ee3-a68b-e70953c782ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.668433', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e106a9d4-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': 'e36eac9503171b50b822a571568597b255521bd55c3d484a23f3284eeb314dd4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.668433', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e106b276-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': '2ac057d54b08faf551fa57984bd2d5924169bf19524c11faaa373362306f7df2'}]}, 'timestamp': '2026-01-12 13:44:55.668915', '_unique_id': '81413a148b074bcda6369ff7f5181364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.669 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.incoming.bytes volume: 24750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35dbddd6-8606-457d-846f-152eea2b85fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.669957', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e106e444-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': 'c822e6e1bd9436a6bb5db27d3d7d8b935094296fef84ccf122a906fbabad02f2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 24750, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.669957', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e106ec64-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': '6f276b33c371f57fd5ed5c23bdfbf55944d0aab9e6a01256cf07c52a352fff81'}]}, 'timestamp': '2026-01-12 13:44:55.670377', '_unique_id': '1e9eadb8a9c24b06bb178ae553b9fee2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.670 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.671 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.671 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05b79b8b-9446-49bf-9b58-4c497893eeba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.671408', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e1071d2e-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': '2ff23d9291e145c1f041f7c521daf4b1d8b7bb24816da2c4ebe80c5b7b923b44'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.671408', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e1072562-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': '6e4cfeaed999626b90577a9d739fff05bef4fc5fe10536e00d5a8989f9b4eec3'}]}, 'timestamp': '2026-01-12 13:44:55.671836', '_unique_id': 'd062bb7045434c45bf7e939fcc5ef726'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.672 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>]
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.673 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.673 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.incoming.packets volume: 132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd241480e-d843-41b7-ab96-922e16dfb19b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.673170', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e10761e4-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': 'd508bcfc7ffd662edb6155b7527218d9ba84b0f9f3a03d0ce30d217432e9ddd3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 132, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.673170', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e10769e6-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': '68c543123029b6f0ab7c06f83d8d08f82f2941ee59d5c9c0d44e98fed44f1b56'}]}, 'timestamp': '2026-01-12 13:44:55.673597', '_unique_id': '81e0312fae5a4326902ea922c28f9d3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.674 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4029496c-662d-4be3-b05c-a2c8f5bb5892', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.674632', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e1079ad8-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': '36e27cf76b0186202a4d16a8a38f79dd92bb8a001286601bce776f98053dfb8a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.674632', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e107a3ca-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': '6f5699711fa9240ce4f377f778fd5a5b5806ad6420c822946c758b346b4d02ef'}]}, 'timestamp': '2026-01-12 13:44:55.675074', '_unique_id': '38ffc463c06c444785669e0cf8ef05c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.675 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.676 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.676 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.676 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.write.latency volume: 309026077 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.676 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55fbec1e-7ee1-40c6-b5ee-8a1bd5e51359', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.676096', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e107d3ea-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '6f7b307afd68cfaebbf23ff578ec9faa29c9f0b9cb61652bd819bf4220cb8bb0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.676096', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e107dbb0-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.666403998, 'message_signature': '77d456d3ae716e1214d7a0c894d299149449ba781813adfa9bc6766e6089a127'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 309026077, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.676096', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e107e39e-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': '2786fde70faf57b8fb6b76c3ecf1bf0eefd3e75f30950ddfdaed527f65d7f6ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.676096', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e107eaec-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.684244029, 'message_signature': 'ab4f962c3b2bad326a06f79adfba738a15e0fdd131c51c0f690fd81b47797a2a'}]}, 'timestamp': '2026-01-12 13:44:55.676901', '_unique_id': 'a0ec0d6f913d48f7b0c77be871f8f4b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.677 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.678 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>]
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.678 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.697 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/cpu volume: 4710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.719 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/cpu volume: 9820000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f8ce1a7-4a56-4d3b-bfda-61018d95f8ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4710000000, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'timestamp': '2026-01-12T13:44:55.678249', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e10b29b4-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.764855836, 'message_signature': 'c7f4c97dc8396ddf665206f64e9b87fe11f2efe41295ef4e5233fbb949771144'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9820000000, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'timestamp': '2026-01-12T13:44:55.678249', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e10e8744-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.78710415, 'message_signature': '5bac86af385efe366358f90256a543615361557754a0462acded09a899a7cb98'}]}, 'timestamp': '2026-01-12 13:44:55.720221', '_unique_id': 'd9b35753bbf84631bf72d781a26582ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.720 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.721 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.721 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.721 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/network.outgoing.bytes volume: 20610 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22b0feec-9520-45d6-ab87-efaf67c51aa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000002-a75f5a44-2ce8-42d7-97fb-0a18198794ef-tap046c8a5e-95', 'timestamp': '2026-01-12T13:44:55.721321', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'tap046c8a5e-95', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:ea:5a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046c8a5e-95'}, 'message_id': 'e10ebaca-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.656652684, 'message_signature': '7d6cc93db7cf64542b1d6fb49108dadb3a23cc482d6c1edd74f1b10224184f46'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 20610, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000001-9828d316-7b89-422e-a561-fad4ab8d9a5a-tapa2325a00-e1', 'timestamp': '2026-01-12T13:44:55.721321', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'tapa2325a00-e1', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:18:95:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2325a00-e1'}, 'message_id': 'e10ec39e-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.65885135, 'message_signature': 'b527d531e792af09117a6870274835feae83430dc822c8f76ca01e556d95338c'}]}, 'timestamp': '2026-01-12 13:44:55.721763', '_unique_id': '0ec5919a4ec94a3590adb89f73d1e613'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.722 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.723 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.723 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.723 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ca2d223-2475-408a-a797-6432b360dbc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.722824', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10ef62a-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.717354527, 'message_signature': '08d06f42a57080d25ef47360b0ad26b08c9b3e1ede325543a7658ced6515610a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.722824', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10efddc-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.717354527, 'message_signature': '407080821b7a2e8d8eacead671979d3d0d82f9c091a8c1ee7930ab79fb219953'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.722824', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10f0502-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.724959694, 'message_signature': '03e13bffbff850d8522d69e3b4e5883f08a6165faeecab43360767549a2fe931'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.722824', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10f0c64-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.724959694, 'message_signature': '04076b5c0abbee8e5aa19918f95ee4d4faddfa3713e3f0f2a6a1c19f2db52678'}]}, 'timestamp': '2026-01-12 13:44:55.723620', '_unique_id': '0c140917dca241ca87f5e01a3a626681'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-881283916>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-199675950>]
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.724 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.725 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.725 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.725 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.725 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ede14ce-51da-4438-afe0-e63c6df6c0e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef-vda', 'timestamp': '2026-01-12T13:44:55.725027', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10f4b7a-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.717354527, 'message_signature': '22b2c22fcdccb5ac37bc9a4a5e8c3044c610f23470356f3a9ee64e3e21cdc663'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'a75f5a44-2ce8-42d7-97fb-0a18198794ef-sda', 'timestamp': '2026-01-12T13:44:55.725027', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-881283916', 'name': 'instance-00000002', 'instance_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10f5340-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.717354527, 'message_signature': '14560b6dfb425edabef30acdf85a9e1e873d0e9b04811ec79cdf6e0e2addc86d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-vda', 'timestamp': '2026-01-12T13:44:55.725027', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 
'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e10f5b2e-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.724959694, 'message_signature': '2d8761669f8b7b16631f87382611351d0c372d3709768c11ae0c1fcee4a6aeb8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a-sda', 'timestamp': '2026-01-12T13:44:55.725027', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e10f627c-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.724959694, 'message_signature': '6dee393ac9bcd87269c3950f1a3f36277694bd9fc3ba6dafe96b9ba70d40ce17'}]}, 'timestamp': '2026-01-12 13:44:55.725821', '_unique_id': 'e43ca7939bfa4b2aae562743d2762955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.726 12 DEBUG ceilometer.compute.pollsters [-] a75f5a44-2ce8-42d7-97fb-0a18198794ef/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance a75f5a44-2ce8-42d7-97fb-0a18198794ef: ceilometer.compute.pollsters.NoVolumeException
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 DEBUG ceilometer.compute.pollsters [-] 9828d316-7b89-422e-a561-fad4ab8d9a5a/memory.usage volume: 42.95703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '739d9d5d-38c8-42e6-8cb5-476840e1e2ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.95703125, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'timestamp': '2026-01-12T13:44:55.726926', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-199675950', 'name': 'instance-00000001', 'instance_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e10f9c06-efbc-11f0-9e2d-fa163ee03944', 'monotonic_time': 2561.78710415, 'message_signature': 'a975e0617bf45bafc344d8b1e4073e7f7990bd1676e640bf83bc33d12cdbb888'}]}, 'timestamp': '2026-01-12 13:44:55.727304', '_unique_id': 'ac0132ac391e4440ab7a4aafa37da8d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:44:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:44:55.727 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:44:56 compute-0 nova_compute[181978]: 2026-01-12 13:44:56.190 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:56 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:56.191 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:44:56 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:56.193 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:44:56 compute-0 podman[210304]: 2026-01-12 13:44:56.546385449 +0000 UTC m=+0.038613942 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:44:57 compute-0 nova_compute[181978]: 2026-01-12 13:44:57.144 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:44:58 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:44:58.194 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:44:58 compute-0 nova_compute[181978]: 2026-01-12 13:44:58.233 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:00 compute-0 podman[210333]: 2026-01-12 13:45:00.548639264 +0000 UTC m=+0.042662658 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Jan 12 13:45:01 compute-0 ovn_controller[94974]: 2026-01-12T13:45:01Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fa:ea:5a 10.100.0.22
Jan 12 13:45:01 compute-0 ovn_controller[94974]: 2026-01-12T13:45:01Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fa:ea:5a 10.100.0.22
Jan 12 13:45:02 compute-0 nova_compute[181978]: 2026-01-12 13:45:02.144 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:03 compute-0 nova_compute[181978]: 2026-01-12 13:45:03.235 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:07 compute-0 nova_compute[181978]: 2026-01-12 13:45:07.146 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:08 compute-0 nova_compute[181978]: 2026-01-12 13:45:08.237 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:09 compute-0 podman[210352]: 2026-01-12 13:45:09.566482728 +0000 UTC m=+0.060867254 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.418 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.419 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.419 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.419 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.419 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.420 181991 INFO nova.compute.manager [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Terminating instance
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.421 181991 DEBUG nova.compute.manager [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:45:10 compute-0 kernel: tap046c8a5e-95 (unregistering): left promiscuous mode
Jan 12 13:45:10 compute-0 NetworkManager[55211]: <info>  [1768225510.4445] device (tap046c8a5e-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00039|binding|INFO|Releasing lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af from this chassis (sb_readonly=0)
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00040|binding|INFO|Setting lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af down in Southbound
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00041|binding|INFO|Removing iface tap046c8a5e-95 ovn-installed in OVS
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.451 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.453 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.455 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:ea:5a 10.100.0.22'], port_security=['fa:16:3e:fa:ea:5a 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d644de00-cee9-4519-857b-9315d08f3a6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99cc2e72-a1a5-4ad5-83b1-2112f5ce8e09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cbc8ad1-eb42-4a9d-8275-a3ca6b307017, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=046c8a5e-9569-4f66-8722-ff77b7dbf6af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.456 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 046c8a5e-9569-4f66-8722-ff77b7dbf6af in datapath d644de00-cee9-4519-857b-9315d08f3a6b unbound from our chassis
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.457 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d644de00-cee9-4519-857b-9315d08f3a6b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.457 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[fa9d5c2b-ddb8-47b8-aa23-1e9c983b1565]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.458 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b namespace which is not needed anymore
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.466 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 12 13:45:10 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 11.423s CPU time.
Jan 12 13:45:10 compute-0 systemd-machined[153581]: Machine qemu-2-instance-00000002 terminated.
Jan 12 13:45:10 compute-0 neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b[210286]: [NOTICE]   (210290) : haproxy version is 2.8.14-c23fe91
Jan 12 13:45:10 compute-0 neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b[210286]: [NOTICE]   (210290) : path to executable is /usr/sbin/haproxy
Jan 12 13:45:10 compute-0 neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b[210286]: [WARNING]  (210290) : Exiting Master process...
Jan 12 13:45:10 compute-0 neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b[210286]: [ALERT]    (210290) : Current worker (210292) exited with code 143 (Terminated)
Jan 12 13:45:10 compute-0 neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b[210286]: [WARNING]  (210290) : All workers exited. Exiting... (0)
Jan 12 13:45:10 compute-0 systemd[1]: libpod-22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071.scope: Deactivated successfully.
Jan 12 13:45:10 compute-0 podman[210397]: 2026-01-12 13:45:10.553056447 +0000 UTC m=+0.032973600 container died 22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 12 13:45:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071-userdata-shm.mount: Deactivated successfully.
Jan 12 13:45:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-54f9c26296d2ba2fe3a1b8e8f043dc6976830a67acd46f3ccb7c49e7af0cb4c2-merged.mount: Deactivated successfully.
Jan 12 13:45:10 compute-0 podman[210397]: 2026-01-12 13:45:10.586169631 +0000 UTC m=+0.066086783 container cleanup 22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 12 13:45:10 compute-0 systemd[1]: libpod-conmon-22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071.scope: Deactivated successfully.
Jan 12 13:45:10 compute-0 podman[210409]: 2026-01-12 13:45:10.619388642 +0000 UTC m=+0.055027439 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 12 13:45:10 compute-0 kernel: tap046c8a5e-95: entered promiscuous mode
Jan 12 13:45:10 compute-0 NetworkManager[55211]: <info>  [1768225510.6323] manager: (tap046c8a5e-95): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Jan 12 13:45:10 compute-0 systemd-udevd[210380]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00042|binding|INFO|Claiming lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af for this chassis.
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00043|binding|INFO|046c8a5e-9569-4f66-8722-ff77b7dbf6af: Claiming fa:16:3e:fa:ea:5a 10.100.0.22
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.634 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 kernel: tap046c8a5e-95 (unregistering): left promiscuous mode
Jan 12 13:45:10 compute-0 podman[210438]: 2026-01-12 13:45:10.643936131 +0000 UTC m=+0.037759565 container remove 22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 12 13:45:10 compute-0 podman[210416]: 2026-01-12 13:45:10.647798908 +0000 UTC m=+0.080014275 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, version=9.6, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.651 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:ea:5a 10.100.0.22'], port_security=['fa:16:3e:fa:ea:5a 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d644de00-cee9-4519-857b-9315d08f3a6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99cc2e72-a1a5-4ad5-83b1-2112f5ce8e09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cbc8ad1-eb42-4a9d-8275-a3ca6b307017, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=046c8a5e-9569-4f66-8722-ff77b7dbf6af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00044|binding|INFO|Setting lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af ovn-installed in OVS
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00045|binding|INFO|Setting lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af up in Southbound
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00046|binding|INFO|Releasing lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af from this chassis (sb_readonly=1)
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00047|if_status|INFO|Not setting lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af down as sb is readonly
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00048|binding|INFO|Removing iface tap046c8a5e-95 ovn-installed in OVS
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.653 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e18a8abb-d89c-4be0-b84f-eb883533b852]: (4, ('Mon Jan 12 01:45:10 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b (22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071)\n22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071\nMon Jan 12 01:45:10 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b (22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071)\n22beb080f948974dc5e8cafd3e43297acc18a8da4c819d4f3078cc5411503071\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.655 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[836932da-23da-4adc-a3fd-7e8b9168a8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.655 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.656 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.657 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd644de00-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00049|binding|INFO|Releasing lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af from this chassis (sb_readonly=0)
Jan 12 13:45:10 compute-0 ovn_controller[94974]: 2026-01-12T13:45:10Z|00050|binding|INFO|Setting lport 046c8a5e-9569-4f66-8722-ff77b7dbf6af down in Southbound
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.659 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.668 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.668 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:ea:5a 10.100.0.22'], port_security=['fa:16:3e:fa:ea:5a 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'a75f5a44-2ce8-42d7-97fb-0a18198794ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d644de00-cee9-4519-857b-9315d08f3a6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99cc2e72-a1a5-4ad5-83b1-2112f5ce8e09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cbc8ad1-eb42-4a9d-8275-a3ca6b307017, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=046c8a5e-9569-4f66-8722-ff77b7dbf6af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.670 181991 INFO nova.virt.libvirt.driver [-] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Instance destroyed successfully.
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.671 181991 DEBUG nova.objects.instance [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid a75f5a44-2ce8-42d7-97fb-0a18198794ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:45:10 compute-0 kernel: tapd644de00-c0: left promiscuous mode
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.680 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.681 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.683 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c3d84c-e526-4242-a23f-6c9de09058aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.688 181991 DEBUG nova.virt.libvirt.vif [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:44:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-881283916',display_name='tempest-TestNetworkBasicOps-server-881283916',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-881283916',id=2,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8VXsWOgksxgSgZGOO55gaORCF3OlcQ+IPqaCCnCta/+H8teqO4aQ9Z/d8/GOEzwFb3jL331i/9/p3enwGNWusccmSiQgSzzYrYKfQ3maOnwB9S22f9FoR84rXMgBhu2Q==',key_name='tempest-TestNetworkBasicOps-1328257164',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:44:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-01t2hs0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:44:50Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a75f5a44-2ce8-42d7-97fb-0a18198794ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.688 181991 DEBUG nova.network.os_vif_util [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "address": "fa:16:3e:fa:ea:5a", "network": {"id": "d644de00-cee9-4519-857b-9315d08f3a6b", "bridge": "br-int", "label": "tempest-network-smoke--894803923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046c8a5e-95", "ovs_interfaceid": "046c8a5e-9569-4f66-8722-ff77b7dbf6af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.689 181991 DEBUG nova.network.os_vif_util [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:ea:5a,bridge_name='br-int',has_traffic_filtering=True,id=046c8a5e-9569-4f66-8722-ff77b7dbf6af,network=Network(d644de00-cee9-4519-857b-9315d08f3a6b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046c8a5e-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.689 181991 DEBUG os_vif [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:ea:5a,bridge_name='br-int',has_traffic_filtering=True,id=046c8a5e-9569-4f66-8722-ff77b7dbf6af,network=Network(d644de00-cee9-4519-857b-9315d08f3a6b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046c8a5e-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.691 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.691 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap046c8a5e-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.692 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[98dec3a8-ab32-4a13-b2e2-2b6addb47d75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.692 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.693 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[019f0447-9de0-409e-a77a-b6fce0e2fa98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.695 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.696 181991 INFO os_vif [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:ea:5a,bridge_name='br-int',has_traffic_filtering=True,id=046c8a5e-9569-4f66-8722-ff77b7dbf6af,network=Network(d644de00-cee9-4519-857b-9315d08f3a6b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046c8a5e-95')
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.696 181991 INFO nova.virt.libvirt.driver [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Deleting instance files /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef_del
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.697 181991 INFO nova.virt.libvirt.driver [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Deletion of /var/lib/nova/instances/a75f5a44-2ce8-42d7-97fb-0a18198794ef_del complete
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.705 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f9ea59-be9a-4e02-bc8d-d41d49169038]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 255507, 'reachable_time': 43483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210484, 'error': None, 'target': 'ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 systemd[1]: run-netns-ovnmeta\x2dd644de00\x2dcee9\x2d4519\x2d857b\x2d9315d08f3a6b.mount: Deactivated successfully.
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.711 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d644de00-cee9-4519-857b-9315d08f3a6b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.712 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3c2f78-8ede-4f86-a70b-87b0020788b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.713 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 046c8a5e-9569-4f66-8722-ff77b7dbf6af in datapath d644de00-cee9-4519-857b-9315d08f3a6b unbound from our chassis
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.714 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d644de00-cee9-4519-857b-9315d08f3a6b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.715 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c00394c8-70c9-4086-9e10-2f47f5b788ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.715 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 046c8a5e-9569-4f66-8722-ff77b7dbf6af in datapath d644de00-cee9-4519-857b-9315d08f3a6b unbound from our chassis
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.716 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d644de00-cee9-4519-857b-9315d08f3a6b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:45:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:10.716 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7bf113-dd21-417d-b019-45da366eff9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.755 181991 DEBUG nova.virt.libvirt.host [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.755 181991 INFO nova.virt.libvirt.host [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] UEFI support detected
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.756 181991 INFO nova.compute.manager [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.756 181991 DEBUG oslo.service.loopingcall [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.758 181991 DEBUG nova.compute.manager [-] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.758 181991 DEBUG nova.network.neutron [-] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.764 181991 DEBUG nova.compute.manager [req-861d99ef-20b1-4bc9-b834-f44780e31e7e req-70c7d546-3077-4b60-9979-b83a3ba9dec9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-unplugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.764 181991 DEBUG oslo_concurrency.lockutils [req-861d99ef-20b1-4bc9-b834-f44780e31e7e req-70c7d546-3077-4b60-9979-b83a3ba9dec9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.764 181991 DEBUG oslo_concurrency.lockutils [req-861d99ef-20b1-4bc9-b834-f44780e31e7e req-70c7d546-3077-4b60-9979-b83a3ba9dec9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.765 181991 DEBUG oslo_concurrency.lockutils [req-861d99ef-20b1-4bc9-b834-f44780e31e7e req-70c7d546-3077-4b60-9979-b83a3ba9dec9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.765 181991 DEBUG nova.compute.manager [req-861d99ef-20b1-4bc9-b834-f44780e31e7e req-70c7d546-3077-4b60-9979-b83a3ba9dec9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] No waiting events found dispatching network-vif-unplugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:10 compute-0 nova_compute[181978]: 2026-01-12 13:45:10.765 181991 DEBUG nova.compute.manager [req-861d99ef-20b1-4bc9-b834-f44780e31e7e req-70c7d546-3077-4b60-9979-b83a3ba9dec9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-unplugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.356 181991 DEBUG nova.network.neutron [-] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.365 181991 INFO nova.compute.manager [-] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Took 0.61 seconds to deallocate network for instance.
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.393 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.393 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.426 181991 DEBUG nova.compute.manager [req-88613680-3b5f-42e3-a72f-f0ecf6c1a3f4 req-c99aaeef-8c29-44e3-bf5e-e3d41cfe1b7b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-deleted-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.457 181991 DEBUG nova.compute.provider_tree [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.466 181991 DEBUG nova.scheduler.client.report [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.477 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.493 181991 INFO nova.scheduler.client.report [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance a75f5a44-2ce8-42d7-97fb-0a18198794ef
Jan 12 13:45:11 compute-0 nova_compute[181978]: 2026-01-12 13:45:11.538 181991 DEBUG oslo_concurrency.lockutils [None req-10b3f780-0dcc-460a-81ec-67a31cd4fe37 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.148 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:12 compute-0 ovn_controller[94974]: 2026-01-12T13:45:12Z|00051|binding|INFO|Releasing lport a3bb26f6-67f0-4092-965f-456ffbc02ff3 from this chassis (sb_readonly=0)
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.838 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.839 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.839 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.839 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.839 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] No waiting events found dispatching network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.840 181991 WARNING nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received unexpected event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af for instance with vm_state deleted and task_state None.
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.840 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.840 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.840 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.841 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.841 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] No waiting events found dispatching network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.841 181991 WARNING nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received unexpected event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af for instance with vm_state deleted and task_state None.
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.841 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.842 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.842 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.842 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.842 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] No waiting events found dispatching network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.843 181991 WARNING nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received unexpected event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af for instance with vm_state deleted and task_state None.
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.843 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-unplugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.843 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.843 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.843 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.844 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] No waiting events found dispatching network-vif-unplugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.844 181991 WARNING nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received unexpected event network-vif-unplugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af for instance with vm_state deleted and task_state None.
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.844 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.844 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.845 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.845 181991 DEBUG oslo_concurrency.lockutils [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a75f5a44-2ce8-42d7-97fb-0a18198794ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.845 181991 DEBUG nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] No waiting events found dispatching network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.845 181991 WARNING nova.compute.manager [req-fc95eaf8-edcc-4268-a9ce-4419f495e452 req-3605bc97-2c40-4597-aecd-78a59b0c9511 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Received unexpected event network-vif-plugged-046c8a5e-9569-4f66-8722-ff77b7dbf6af for instance with vm_state deleted and task_state None.
Jan 12 13:45:12 compute-0 nova_compute[181978]: 2026-01-12 13:45:12.848 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.479 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.495 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.495 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.495 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.508 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.542 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.542 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.542 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.542 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.543 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.543 181991 INFO nova.compute.manager [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Terminating instance
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.544 181991 DEBUG nova.compute.manager [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:45:13 compute-0 kernel: tapa2325a00-e1 (unregistering): left promiscuous mode
Jan 12 13:45:13 compute-0 NetworkManager[55211]: <info>  [1768225513.5674] device (tapa2325a00-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.571 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 ovn_controller[94974]: 2026-01-12T13:45:13Z|00052|binding|INFO|Releasing lport a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c from this chassis (sb_readonly=0)
Jan 12 13:45:13 compute-0 ovn_controller[94974]: 2026-01-12T13:45:13Z|00053|binding|INFO|Setting lport a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c down in Southbound
Jan 12 13:45:13 compute-0 ovn_controller[94974]: 2026-01-12T13:45:13Z|00054|binding|INFO|Removing iface tapa2325a00-e1 ovn-installed in OVS
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.578 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:95:cb 10.100.0.12'], port_security=['fa:16:3e:18:95:cb 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9828d316-7b89-422e-a561-fad4ab8d9a5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e17082c0-6f0c-461a-a787-3e29d33c0965', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6348b0d4-7e0c-43c9-b18b-1192deea413c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f553e1bd-6f22-4cf5-81ec-96063b5f8304, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.579 104189 INFO neutron.agent.ovn.metadata.agent [-] Port a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c in datapath e17082c0-6f0c-461a-a787-3e29d33c0965 unbound from our chassis
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.580 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e17082c0-6f0c-461a-a787-3e29d33c0965, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.580 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[18c7e2c5-1d9b-4de0-89ef-253acc5c5ae5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.581 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965 namespace which is not needed anymore
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.587 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 12 13:45:13 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 12.173s CPU time.
Jan 12 13:45:13 compute-0 systemd-machined[153581]: Machine qemu-1-instance-00000001 terminated.
Jan 12 13:45:13 compute-0 neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965[210041]: [NOTICE]   (210045) : haproxy version is 2.8.14-c23fe91
Jan 12 13:45:13 compute-0 neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965[210041]: [NOTICE]   (210045) : path to executable is /usr/sbin/haproxy
Jan 12 13:45:13 compute-0 neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965[210041]: [ALERT]    (210045) : Current worker (210047) exited with code 143 (Terminated)
Jan 12 13:45:13 compute-0 neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965[210041]: [WARNING]  (210045) : All workers exited. Exiting... (0)
Jan 12 13:45:13 compute-0 systemd[1]: libpod-1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5.scope: Deactivated successfully.
Jan 12 13:45:13 compute-0 podman[210508]: 2026-01-12 13:45:13.672196292 +0000 UTC m=+0.032116018 container died 1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 12 13:45:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5-userdata-shm.mount: Deactivated successfully.
Jan 12 13:45:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-e9c41b9f9904f5f6e6a2e63fac0cd7b1eac772e579467896fc8f33f5b15ffea0-merged.mount: Deactivated successfully.
Jan 12 13:45:13 compute-0 podman[210508]: 2026-01-12 13:45:13.692707599 +0000 UTC m=+0.052627324 container cleanup 1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:45:13 compute-0 systemd[1]: libpod-conmon-1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5.scope: Deactivated successfully.
Jan 12 13:45:13 compute-0 podman[210531]: 2026-01-12 13:45:13.728319432 +0000 UTC m=+0.022168015 container remove 1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.731 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ca533eb5-2058-417f-8b7b-79e4fc35e59d]: (4, ('Mon Jan 12 01:45:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965 (1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5)\n1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5\nMon Jan 12 01:45:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965 (1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5)\n1c0e46ae987bb03df773dae6708cdf5aafcb6b77b90b876562c82c4520caa5c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.732 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[09317557-67c5-4342-85be-f1560bb6bc74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.733 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape17082c0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.736 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 kernel: tape17082c0-60: left promiscuous mode
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.747 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.750 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.752 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9bf7cb-2b23-419d-ab12-45b24721aa71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.762 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f13d4d-f266-4af1-9263-c4c27c805b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.762 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8bfb6c72-fd5b-4d95-9289-c04007591a9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.773 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c634a7a4-4bf6-4ddf-bf69-77d52ed0c9c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 252364, 'reachable_time': 17492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210554, 'error': None, 'target': 'ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.775 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e17082c0-6f0c-461a-a787-3e29d33c0965 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:45:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:13.775 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc89522-d8b3-4195-b673-054e7df9f9f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:13 compute-0 systemd[1]: run-netns-ovnmeta\x2de17082c0\x2d6f0c\x2d461a\x2da787\x2d3e29d33c0965.mount: Deactivated successfully.
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.788 181991 INFO nova.virt.libvirt.driver [-] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Instance destroyed successfully.
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.789 181991 DEBUG nova.objects.instance [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 9828d316-7b89-422e-a561-fad4ab8d9a5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.803 181991 DEBUG nova.virt.libvirt.vif [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:44:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-199675950',display_name='tempest-TestNetworkBasicOps-server-199675950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-199675950',id=1,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBITi0YMEMSb5EvQ/nE1lRL+KinozAtP7g7HW8TDnURVyfCEr6LGxgSnlcrD0JrfvV3bguWTEDFEDbbrirZoF9elh+X76j9Nfs8oPdJ+pu8HpUc5wGK0VIDXn7GVcMT8a9Q==',key_name='tempest-TestNetworkBasicOps-1498408073',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:44:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-tx90ib4j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:44:15Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=9828d316-7b89-422e-a561-fad4ab8d9a5a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.804 181991 DEBUG nova.network.os_vif_util [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "address": "fa:16:3e:18:95:cb", "network": {"id": "e17082c0-6f0c-461a-a787-3e29d33c0965", "bridge": "br-int", "label": "tempest-network-smoke--600182846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2325a00-e1", "ovs_interfaceid": "a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.804 181991 DEBUG nova.network.os_vif_util [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:18:95:cb,bridge_name='br-int',has_traffic_filtering=True,id=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c,network=Network(e17082c0-6f0c-461a-a787-3e29d33c0965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2325a00-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.804 181991 DEBUG os_vif [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:18:95:cb,bridge_name='br-int',has_traffic_filtering=True,id=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c,network=Network(e17082c0-6f0c-461a-a787-3e29d33c0965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2325a00-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.805 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.806 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2325a00-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.806 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.808 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.809 181991 INFO os_vif [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:18:95:cb,bridge_name='br-int',has_traffic_filtering=True,id=a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c,network=Network(e17082c0-6f0c-461a-a787-3e29d33c0965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2325a00-e1')
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.810 181991 INFO nova.virt.libvirt.driver [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Deleting instance files /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a_del
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.810 181991 INFO nova.virt.libvirt.driver [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Deletion of /var/lib/nova/instances/9828d316-7b89-422e-a561-fad4ab8d9a5a_del complete
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.848 181991 INFO nova.compute.manager [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.848 181991 DEBUG oslo.service.loopingcall [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.848 181991 DEBUG nova.compute.manager [-] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:45:13 compute-0 nova_compute[181978]: 2026-01-12 13:45:13.849 181991 DEBUG nova.network.neutron [-] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.361 181991 DEBUG nova.network.neutron [-] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.370 181991 INFO nova.compute.manager [-] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Took 0.52 seconds to deallocate network for instance.
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.399 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.399 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.433 181991 DEBUG nova.compute.provider_tree [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.440 181991 DEBUG nova.scheduler.client.report [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.565 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.582 181991 INFO nova.scheduler.client.report [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 9828d316-7b89-422e-a561-fad4ab8d9a5a
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.622 181991 DEBUG oslo_concurrency.lockutils [None req-306293b0-128f-4570-811b-d261dfdfef51 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.918 181991 DEBUG nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-changed-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.918 181991 DEBUG nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Refreshing instance network info cache due to event network-changed-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.919 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.919 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:45:14 compute-0 nova_compute[181978]: 2026-01-12 13:45:14.919 181991 DEBUG nova.network.neutron [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Refreshing network info cache for port a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:45:15 compute-0 nova_compute[181978]: 2026-01-12 13:45:15.001 181991 DEBUG nova.network.neutron [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:45:15 compute-0 nova_compute[181978]: 2026-01-12 13:45:15.515 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:15 compute-0 nova_compute[181978]: 2026-01-12 13:45:15.516 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:45:15 compute-0 nova_compute[181978]: 2026-01-12 13:45:15.516 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:45:15 compute-0 nova_compute[181978]: 2026-01-12 13:45:15.528 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:45:15 compute-0 nova_compute[181978]: 2026-01-12 13:45:15.528 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:15 compute-0 nova_compute[181978]: 2026-01-12 13:45:15.528 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.168 181991 DEBUG nova.network.neutron [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.168 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-9828d316-7b89-422e-a561-fad4ab8d9a5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.168 181991 DEBUG nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-vif-unplugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.168 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.169 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.169 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.169 181991 DEBUG nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] No waiting events found dispatching network-vif-unplugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.169 181991 WARNING nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received unexpected event network-vif-unplugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c for instance with vm_state deleted and task_state None.
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.169 181991 DEBUG nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.170 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.170 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.170 181991 DEBUG oslo_concurrency.lockutils [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "9828d316-7b89-422e-a561-fad4ab8d9a5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.170 181991 DEBUG nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] No waiting events found dispatching network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.170 181991 WARNING nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received unexpected event network-vif-plugged-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c for instance with vm_state deleted and task_state None.
Jan 12 13:45:16 compute-0 nova_compute[181978]: 2026-01-12 13:45:16.170 181991 DEBUG nova.compute.manager [req-c815c286-f209-434e-8882-2cdf4c8fc271 req-4b3b6339-9cb7-435d-948f-f5102378341b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Received event network-vif-deleted-a2325a00-e1ba-4fe0-8d9d-a4eb59ecf58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.151 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.502 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:45:17 compute-0 podman[210562]: 2026-01-12 13:45:17.570425357 +0000 UTC m=+0.061009702 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.710 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.711 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5738MB free_disk=73.38475799560547GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.711 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.711 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.788 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.788 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.834 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.845 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.858 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:45:17 compute-0 nova_compute[181978]: 2026-01-12 13:45:17.858 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:18 compute-0 nova_compute[181978]: 2026-01-12 13:45:18.807 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:18 compute-0 nova_compute[181978]: 2026-01-12 13:45:18.854 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:18 compute-0 nova_compute[181978]: 2026-01-12 13:45:18.854 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:18 compute-0 nova_compute[181978]: 2026-01-12 13:45:18.855 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:45:18 compute-0 nova_compute[181978]: 2026-01-12 13:45:18.855 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:45:19 compute-0 nova_compute[181978]: 2026-01-12 13:45:19.220 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:19 compute-0 nova_compute[181978]: 2026-01-12 13:45:19.294 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:22 compute-0 nova_compute[181978]: 2026-01-12 13:45:22.152 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:23 compute-0 nova_compute[181978]: 2026-01-12 13:45:23.807 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:25 compute-0 nova_compute[181978]: 2026-01-12 13:45:25.668 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225510.6675484, a75f5a44-2ce8-42d7-97fb-0a18198794ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:45:25 compute-0 nova_compute[181978]: 2026-01-12 13:45:25.669 181991 INFO nova.compute.manager [-] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] VM Stopped (Lifecycle Event)
Jan 12 13:45:25 compute-0 nova_compute[181978]: 2026-01-12 13:45:25.689 181991 DEBUG nova.compute.manager [None req-c016fc8f-3fdf-423c-8851-abc9022963b2 - - - - - -] [instance: a75f5a44-2ce8-42d7-97fb-0a18198794ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:45:27 compute-0 nova_compute[181978]: 2026-01-12 13:45:27.154 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:27 compute-0 podman[210581]: 2026-01-12 13:45:27.542359433 +0000 UTC m=+0.033776872 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:45:28 compute-0 nova_compute[181978]: 2026-01-12 13:45:28.788 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225513.7871454, 9828d316-7b89-422e-a561-fad4ab8d9a5a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:45:28 compute-0 nova_compute[181978]: 2026-01-12 13:45:28.788 181991 INFO nova.compute.manager [-] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] VM Stopped (Lifecycle Event)
Jan 12 13:45:28 compute-0 nova_compute[181978]: 2026-01-12 13:45:28.804 181991 DEBUG nova.compute.manager [None req-f8c1ce36-b315-44dc-8021-1ad455849339 - - - - - -] [instance: 9828d316-7b89-422e-a561-fad4ab8d9a5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:45:28 compute-0 nova_compute[181978]: 2026-01-12 13:45:28.808 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.458 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.458 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.471 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.527 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.528 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.533 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.533 181991 INFO nova.compute.claims [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:45:31 compute-0 podman[210602]: 2026-01-12 13:45:31.543708853 +0000 UTC m=+0.035192027 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.650 181991 DEBUG nova.compute.provider_tree [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.668 181991 DEBUG nova.scheduler.client.report [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.683 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.684 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.719 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.719 181991 DEBUG nova.network.neutron [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.732 181991 INFO nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.742 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.869 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.869 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.870 181991 INFO nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Creating image(s)
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.870 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.870 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.871 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.881 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.923 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.923 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.924 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.933 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.973 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.974 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.990 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.991 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:31 compute-0 nova_compute[181978]: 2026-01-12 13:45:31.991 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.009 181991 DEBUG nova.policy [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.032 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.033 181991 DEBUG nova.virt.disk.api [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.033 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.073 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.074 181991 DEBUG nova.virt.disk.api [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.074 181991 DEBUG nova.objects.instance [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.085 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.085 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Ensure instance console log exists: /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.086 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.086 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.087 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.155 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:32 compute-0 nova_compute[181978]: 2026-01-12 13:45:32.709 181991 DEBUG nova.network.neutron [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Successfully created port: 8163da9a-c7bd-4211-9987-5fa00984c726 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.314 181991 DEBUG nova.network.neutron [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Successfully updated port: 8163da9a-c7bd-4211-9987-5fa00984c726 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.327 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.328 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.328 181991 DEBUG nova.network.neutron [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.391 181991 DEBUG nova.compute.manager [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-changed-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.392 181991 DEBUG nova.compute.manager [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing instance network info cache due to event network-changed-8163da9a-c7bd-4211-9987-5fa00984c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.392 181991 DEBUG oslo_concurrency.lockutils [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.453 181991 DEBUG nova.network.neutron [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.809 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.935 181991 DEBUG nova.network.neutron [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.949 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.949 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Instance network_info: |[{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.950 181991 DEBUG oslo_concurrency.lockutils [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.950 181991 DEBUG nova.network.neutron [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing network info cache for port 8163da9a-c7bd-4211-9987-5fa00984c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.952 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Start _get_guest_xml network_info=[{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.956 181991 WARNING nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.962 181991 DEBUG nova.virt.libvirt.host [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.962 181991 DEBUG nova.virt.libvirt.host [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.965 181991 DEBUG nova.virt.libvirt.host [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.965 181991 DEBUG nova.virt.libvirt.host [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.965 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.965 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.966 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.966 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.966 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.966 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.966 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.967 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.967 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.967 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.967 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.967 181991 DEBUG nova.virt.hardware [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.970 181991 DEBUG nova.virt.libvirt.vif [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:45:31Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.970 181991 DEBUG nova.network.os_vif_util [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.971 181991 DEBUG nova.network.os_vif_util [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:66:08,bridge_name='br-int',has_traffic_filtering=True,id=8163da9a-c7bd-4211-9987-5fa00984c726,network=Network(6047b481-74d8-4106-8233-64be950f9819),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8163da9a-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.971 181991 DEBUG nova.objects.instance [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.982 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <uuid>a62d7bac-49cb-4c15-9480-f0966c234d04</uuid>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <name>instance-00000003</name>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:45:33</nova:creationTime>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:45:33 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <system>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <entry name="serial">a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <entry name="uuid">a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </system>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <os>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   </os>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <features>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   </features>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:db:66:08"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <target dev="tap8163da9a-c7"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log" append="off"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <video>
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </video>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:45:33 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:45:33 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:45:33 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:45:33 compute-0 nova_compute[181978]: </domain>
Jan 12 13:45:33 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.983 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Preparing to wait for external event network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.984 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.984 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.984 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.984 181991 DEBUG nova.virt.libvirt.vif [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:45:31Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.985 181991 DEBUG nova.network.os_vif_util [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.985 181991 DEBUG nova.network.os_vif_util [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:66:08,bridge_name='br-int',has_traffic_filtering=True,id=8163da9a-c7bd-4211-9987-5fa00984c726,network=Network(6047b481-74d8-4106-8233-64be950f9819),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8163da9a-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.985 181991 DEBUG os_vif [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:66:08,bridge_name='br-int',has_traffic_filtering=True,id=8163da9a-c7bd-4211-9987-5fa00984c726,network=Network(6047b481-74d8-4106-8233-64be950f9819),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8163da9a-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.986 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.986 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.986 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.988 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.988 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8163da9a-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.989 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8163da9a-c7, col_values=(('external_ids', {'iface-id': '8163da9a-c7bd-4211-9987-5fa00984c726', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:66:08', 'vm-uuid': 'a62d7bac-49cb-4c15-9480-f0966c234d04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.989 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:33 compute-0 NetworkManager[55211]: <info>  [1768225533.9905] manager: (tap8163da9a-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.991 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.993 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:33 compute-0 nova_compute[181978]: 2026-01-12 13:45:33.995 181991 INFO os_vif [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:66:08,bridge_name='br-int',has_traffic_filtering=True,id=8163da9a-c7bd-4211-9987-5fa00984c726,network=Network(6047b481-74d8-4106-8233-64be950f9819),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8163da9a-c7')
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.030 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.030 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.030 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:db:66:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.031 181991 INFO nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Using config drive
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.275 181991 INFO nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Creating config drive at /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.279 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp__itk80 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.396 181991 DEBUG oslo_concurrency.processutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp__itk80" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:45:34 compute-0 kernel: tap8163da9a-c7: entered promiscuous mode
Jan 12 13:45:34 compute-0 ovn_controller[94974]: 2026-01-12T13:45:34Z|00055|binding|INFO|Claiming lport 8163da9a-c7bd-4211-9987-5fa00984c726 for this chassis.
Jan 12 13:45:34 compute-0 ovn_controller[94974]: 2026-01-12T13:45:34Z|00056|binding|INFO|8163da9a-c7bd-4211-9987-5fa00984c726: Claiming fa:16:3e:db:66:08 10.100.0.8
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.435 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 NetworkManager[55211]: <info>  [1768225534.4373] manager: (tap8163da9a-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.442 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:66:08 10.100.0.8'], port_security=['fa:16:3e:db:66:08 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a62d7bac-49cb-4c15-9480-f0966c234d04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6047b481-74d8-4106-8233-64be950f9819', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5578ebe3-1d0d-4674-ad1c-b614da7395c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a076cb36-abd8-4ce2-9cbb-c6cc0abb14bd, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=8163da9a-c7bd-4211-9987-5fa00984c726) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.443 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 8163da9a-c7bd-4211-9987-5fa00984c726 in datapath 6047b481-74d8-4106-8233-64be950f9819 bound to our chassis
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.444 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6047b481-74d8-4106-8233-64be950f9819
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.451 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[34953634-979e-4b38-9d3d-b778132c54fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.451 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6047b481-71 in ovnmeta-6047b481-74d8-4106-8233-64be950f9819 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.452 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6047b481-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.452 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4f9292-87f5-4cf8-acfe-f36a1d67b9fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.453 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[db11f633-8ac4-4b9e-ab91-b88f06a4bb72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.460 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[3110aa8e-955b-4c31-82bf-d5189af5fe0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 systemd-udevd[210654]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:45:34 compute-0 systemd-machined[153581]: New machine qemu-3-instance-00000003.
Jan 12 13:45:34 compute-0 NetworkManager[55211]: <info>  [1768225534.4747] device (tap8163da9a-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:45:34 compute-0 NetworkManager[55211]: <info>  [1768225534.4754] device (tap8163da9a-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:45:34 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.481 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e17623de-426a-476b-8470-ac333337cad2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.494 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 ovn_controller[94974]: 2026-01-12T13:45:34Z|00057|binding|INFO|Setting lport 8163da9a-c7bd-4211-9987-5fa00984c726 ovn-installed in OVS
Jan 12 13:45:34 compute-0 ovn_controller[94974]: 2026-01-12T13:45:34Z|00058|binding|INFO|Setting lport 8163da9a-c7bd-4211-9987-5fa00984c726 up in Southbound
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.500 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a42ad8-3bc0-44fe-b3b2-109141c78269]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.501 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 NetworkManager[55211]: <info>  [1768225534.5061] manager: (tap6047b481-70): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.505 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[aec36f39-3dde-4fa2-a952-2159f41d01c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.528 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[f18049cd-1b53-4bfc-bc4c-f871cc7b66af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.531 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[349c2c6a-17e6-461b-9669-eb2e6d6217c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 NetworkManager[55211]: <info>  [1768225534.5444] device (tap6047b481-70): carrier: link connected
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.547 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[804ab55d-32d4-4097-8eaa-a627f368e09f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.557 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ad119378-e909-45a2-bc1f-aa94c0c00960]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6047b481-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:36:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 260057, 'reachable_time': 43122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210678, 'error': None, 'target': 'ovnmeta-6047b481-74d8-4106-8233-64be950f9819', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.565 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1269315b-f4d3-4acb-9bbc-049b4a3e0f5e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:3682'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 260057, 'tstamp': 260057}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210679, 'error': None, 'target': 'ovnmeta-6047b481-74d8-4106-8233-64be950f9819', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.574 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[97cb93bf-d7ac-4cd0-ad57-62bef0e992d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6047b481-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:36:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 260057, 'reachable_time': 43122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210680, 'error': None, 'target': 'ovnmeta-6047b481-74d8-4106-8233-64be950f9819', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.591 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b74a03-06fa-4bde-895c-943aa7e5538a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.620 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7e2766-10cc-4f09-b799-fffcfb117692]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.621 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6047b481-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.621 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.622 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6047b481-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.623 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 NetworkManager[55211]: <info>  [1768225534.6241] manager: (tap6047b481-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 12 13:45:34 compute-0 kernel: tap6047b481-70: entered promiscuous mode
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.625 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.627 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6047b481-70, col_values=(('external_ids', {'iface-id': 'e064259b-0a90-4a58-b36e-6946c46982b0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.628 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.628 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 ovn_controller[94974]: 2026-01-12T13:45:34Z|00059|binding|INFO|Releasing lport e064259b-0a90-4a58-b36e-6946c46982b0 from this chassis (sb_readonly=0)
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.629 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6047b481-74d8-4106-8233-64be950f9819.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6047b481-74d8-4106-8233-64be950f9819.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.630 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8a2013-94b0-4b54-b506-ab48f90632ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.630 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-6047b481-74d8-4106-8233-64be950f9819
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/6047b481-74d8-4106-8233-64be950f9819.pid.haproxy
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID 6047b481-74d8-4106-8233-64be950f9819
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:45:34 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:34.632 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6047b481-74d8-4106-8233-64be950f9819', 'env', 'PROCESS_TAG=haproxy-6047b481-74d8-4106-8233-64be950f9819', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6047b481-74d8-4106-8233-64be950f9819.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.640 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.804 181991 DEBUG nova.network.neutron [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updated VIF entry in instance network info cache for port 8163da9a-c7bd-4211-9987-5fa00984c726. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.805 181991 DEBUG nova.network.neutron [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:45:34 compute-0 nova_compute[181978]: 2026-01-12 13:45:34.829 181991 DEBUG oslo_concurrency.lockutils [req-b4004cb1-8866-465e-80d1-a6ed8d031c40 req-3a909550-6184-40a7-b161-0fc7d75826be 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:45:34 compute-0 podman[210708]: 2026-01-12 13:45:34.895698693 +0000 UTC m=+0.026459599 container create 3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:45:34 compute-0 systemd[1]: Started libpod-conmon-3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84.scope.
Jan 12 13:45:34 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:45:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17b8c83e58caebf8c166e5c9cc224e9d9749b764ca37a9d717153affb3ff19a1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:45:34 compute-0 podman[210708]: 2026-01-12 13:45:34.949745629 +0000 UTC m=+0.080506525 container init 3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:45:34 compute-0 podman[210708]: 2026-01-12 13:45:34.953607877 +0000 UTC m=+0.084368773 container start 3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 12 13:45:34 compute-0 podman[210708]: 2026-01-12 13:45:34.883735233 +0000 UTC m=+0.014496139 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:45:34 compute-0 neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819[210720]: [NOTICE]   (210724) : New worker (210726) forked
Jan 12 13:45:34 compute-0 neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819[210720]: [NOTICE]   (210724) : Loading success.
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.160 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225535.1599202, a62d7bac-49cb-4c15-9480-f0966c234d04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.160 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] VM Started (Lifecycle Event)
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.176 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.178 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225535.1623924, a62d7bac-49cb-4c15-9480-f0966c234d04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.178 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] VM Paused (Lifecycle Event)
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.189 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.191 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.202 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.462 181991 DEBUG nova.compute.manager [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.462 181991 DEBUG oslo_concurrency.lockutils [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.462 181991 DEBUG oslo_concurrency.lockutils [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.462 181991 DEBUG oslo_concurrency.lockutils [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.462 181991 DEBUG nova.compute.manager [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Processing event network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.463 181991 DEBUG nova.compute.manager [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.463 181991 DEBUG oslo_concurrency.lockutils [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.463 181991 DEBUG oslo_concurrency.lockutils [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.463 181991 DEBUG oslo_concurrency.lockutils [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.463 181991 DEBUG nova.compute.manager [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] No waiting events found dispatching network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.463 181991 WARNING nova.compute.manager [req-b6402f2e-c367-4029-ae4d-3be384240c03 req-850d61de-f612-481d-ad3f-0f713e7876b3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received unexpected event network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 for instance with vm_state building and task_state spawning.
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.464 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.465 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225535.4658365, a62d7bac-49cb-4c15-9480-f0966c234d04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.466 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] VM Resumed (Lifecycle Event)
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.540 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.541 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.544 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.545 181991 INFO nova.virt.libvirt.driver [-] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Instance spawned successfully.
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.546 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.562 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.562 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.563 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.563 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.563 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.563 181991 DEBUG nova.virt.libvirt.driver [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.566 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.601 181991 INFO nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Took 3.73 seconds to spawn the instance on the hypervisor.
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.602 181991 DEBUG nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.640 181991 INFO nova.compute.manager [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Took 4.14 seconds to build instance.
Jan 12 13:45:35 compute-0 nova_compute[181978]: 2026-01-12 13:45:35.650 181991 DEBUG oslo_concurrency.lockutils [None req-2e51a449-01b2-4168-ad65-1110ce92374b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:37 compute-0 nova_compute[181978]: 2026-01-12 13:45:37.156 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:38 compute-0 nova_compute[181978]: 2026-01-12 13:45:38.989 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:39 compute-0 NetworkManager[55211]: <info>  [1768225539.1321] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 12 13:45:39 compute-0 NetworkManager[55211]: <info>  [1768225539.1327] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.132 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:39 compute-0 ovn_controller[94974]: 2026-01-12T13:45:39Z|00060|binding|INFO|Releasing lport e064259b-0a90-4a58-b36e-6946c46982b0 from this chassis (sb_readonly=0)
Jan 12 13:45:39 compute-0 ovn_controller[94974]: 2026-01-12T13:45:39Z|00061|binding|INFO|Releasing lport e064259b-0a90-4a58-b36e-6946c46982b0 from this chassis (sb_readonly=0)
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.168 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.172 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.470 181991 DEBUG nova.compute.manager [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-changed-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.470 181991 DEBUG nova.compute.manager [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing instance network info cache due to event network-changed-8163da9a-c7bd-4211-9987-5fa00984c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.470 181991 DEBUG oslo_concurrency.lockutils [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.471 181991 DEBUG oslo_concurrency.lockutils [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:45:39 compute-0 nova_compute[181978]: 2026-01-12 13:45:39.471 181991 DEBUG nova.network.neutron [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing network info cache for port 8163da9a-c7bd-4211-9987-5fa00984c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:45:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:40.199 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:40.199 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:40.200 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:45:40 compute-0 podman[210739]: 2026-01-12 13:45:40.557419695 +0000 UTC m=+0.053963822 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 12 13:45:41 compute-0 nova_compute[181978]: 2026-01-12 13:45:41.416 181991 DEBUG nova.network.neutron [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updated VIF entry in instance network info cache for port 8163da9a-c7bd-4211-9987-5fa00984c726. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:45:41 compute-0 nova_compute[181978]: 2026-01-12 13:45:41.417 181991 DEBUG nova.network.neutron [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:45:41 compute-0 nova_compute[181978]: 2026-01-12 13:45:41.430 181991 DEBUG oslo_concurrency.lockutils [req-d5dc068f-5edd-41f5-9126-09e44cba37eb req-1b25d5ce-9079-4112-85e5-451510e6559d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:45:41 compute-0 podman[210763]: 2026-01-12 13:45:41.543997096 +0000 UTC m=+0.039542703 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 12 13:45:41 compute-0 podman[210764]: 2026-01-12 13:45:41.557488286 +0000 UTC m=+0.051366450 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 12 13:45:42 compute-0 nova_compute[181978]: 2026-01-12 13:45:42.159 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:43 compute-0 nova_compute[181978]: 2026-01-12 13:45:43.991 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:45 compute-0 ovn_controller[94974]: 2026-01-12T13:45:45Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:66:08 10.100.0.8
Jan 12 13:45:45 compute-0 ovn_controller[94974]: 2026-01-12T13:45:45Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:66:08 10.100.0.8
Jan 12 13:45:47 compute-0 nova_compute[181978]: 2026-01-12 13:45:47.159 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:48 compute-0 podman[210814]: 2026-01-12 13:45:48.567674719 +0000 UTC m=+0.063361991 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 12 13:45:48 compute-0 nova_compute[181978]: 2026-01-12 13:45:48.993 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:51 compute-0 nova_compute[181978]: 2026-01-12 13:45:51.772 181991 INFO nova.compute.manager [None req-a517e25f-61b1-42c4-a657-863acb1e39b0 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Get console output
Jan 12 13:45:51 compute-0 nova_compute[181978]: 2026-01-12 13:45:51.775 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:45:52 compute-0 nova_compute[181978]: 2026-01-12 13:45:52.160 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:53 compute-0 nova_compute[181978]: 2026-01-12 13:45:53.994 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:54 compute-0 nova_compute[181978]: 2026-01-12 13:45:54.346 181991 DEBUG oslo_concurrency.lockutils [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "interface-a62d7bac-49cb-4c15-9480-f0966c234d04-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:45:54 compute-0 nova_compute[181978]: 2026-01-12 13:45:54.346 181991 DEBUG oslo_concurrency.lockutils [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-a62d7bac-49cb-4c15-9480-f0966c234d04-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:45:54 compute-0 nova_compute[181978]: 2026-01-12 13:45:54.347 181991 DEBUG nova.objects.instance [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'flavor' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:45:54 compute-0 nova_compute[181978]: 2026-01-12 13:45:54.578 181991 DEBUG nova.objects.instance [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_requests' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:45:54 compute-0 nova_compute[181978]: 2026-01-12 13:45:54.590 181991 DEBUG nova.network.neutron [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:45:54 compute-0 nova_compute[181978]: 2026-01-12 13:45:54.721 181991 DEBUG nova.policy [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:45:55 compute-0 nova_compute[181978]: 2026-01-12 13:45:55.106 181991 DEBUG nova.network.neutron [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Successfully created port: 943f8d68-bb52-4293-98db-563f4e14df6e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:45:57 compute-0 nova_compute[181978]: 2026-01-12 13:45:57.162 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:58 compute-0 podman[210830]: 2026-01-12 13:45:58.541394518 +0000 UTC m=+0.036957345 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:45:58 compute-0 nova_compute[181978]: 2026-01-12 13:45:58.852 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:58 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:58.853 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:45:58 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:45:58.854 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:45:58 compute-0 nova_compute[181978]: 2026-01-12 13:45:58.995 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:45:59 compute-0 nova_compute[181978]: 2026-01-12 13:45:59.009 181991 DEBUG nova.network.neutron [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Successfully updated port: 943f8d68-bb52-4293-98db-563f4e14df6e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:45:59 compute-0 nova_compute[181978]: 2026-01-12 13:45:59.024 181991 DEBUG oslo_concurrency.lockutils [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:45:59 compute-0 nova_compute[181978]: 2026-01-12 13:45:59.024 181991 DEBUG oslo_concurrency.lockutils [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:45:59 compute-0 nova_compute[181978]: 2026-01-12 13:45:59.024 181991 DEBUG nova.network.neutron [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:45:59 compute-0 nova_compute[181978]: 2026-01-12 13:45:59.094 181991 DEBUG nova.compute.manager [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-changed-943f8d68-bb52-4293-98db-563f4e14df6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:45:59 compute-0 nova_compute[181978]: 2026-01-12 13:45:59.094 181991 DEBUG nova.compute.manager [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing instance network info cache due to event network-changed-943f8d68-bb52-4293-98db-563f4e14df6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:45:59 compute-0 nova_compute[181978]: 2026-01-12 13:45:59.095 181991 DEBUG oslo_concurrency.lockutils [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.164 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.209 181991 DEBUG nova.network.neutron [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.224 181991 DEBUG oslo_concurrency.lockutils [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.225 181991 DEBUG oslo_concurrency.lockutils [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.225 181991 DEBUG nova.network.neutron [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing network info cache for port 943f8d68-bb52-4293-98db-563f4e14df6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.227 181991 DEBUG nova.virt.libvirt.vif [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:45:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:45:35Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.227 181991 DEBUG nova.network.os_vif_util [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.228 181991 DEBUG nova.network.os_vif_util [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.228 181991 DEBUG os_vif [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.228 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.228 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.229 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.230 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.231 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap943f8d68-bb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.231 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap943f8d68-bb, col_values=(('external_ids', {'iface-id': '943f8d68-bb52-4293-98db-563f4e14df6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:af:c1', 'vm-uuid': 'a62d7bac-49cb-4c15-9480-f0966c234d04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.232 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 NetworkManager[55211]: <info>  [1768225562.2331] manager: (tap943f8d68-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.234 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.237 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.237 181991 INFO os_vif [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb')
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.238 181991 DEBUG nova.virt.libvirt.vif [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:45:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:45:35Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.238 181991 DEBUG nova.network.os_vif_util [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.239 181991 DEBUG nova.network.os_vif_util [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.240 181991 DEBUG nova.virt.libvirt.guest [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] attach device xml: <interface type="ethernet">
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <mac address="fa:16:3e:71:af:c1"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <model type="virtio"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <mtu size="1442"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <target dev="tap943f8d68-bb"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]: </interface>
Jan 12 13:46:02 compute-0 nova_compute[181978]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 12 13:46:02 compute-0 kernel: tap943f8d68-bb: entered promiscuous mode
Jan 12 13:46:02 compute-0 NetworkManager[55211]: <info>  [1768225562.2469] manager: (tap943f8d68-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Jan 12 13:46:02 compute-0 ovn_controller[94974]: 2026-01-12T13:46:02Z|00062|binding|INFO|Claiming lport 943f8d68-bb52-4293-98db-563f4e14df6e for this chassis.
Jan 12 13:46:02 compute-0 ovn_controller[94974]: 2026-01-12T13:46:02Z|00063|binding|INFO|943f8d68-bb52-4293-98db-563f4e14df6e: Claiming fa:16:3e:71:af:c1 10.100.0.20
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.253 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.258 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:af:c1 10.100.0.20'], port_security=['fa:16:3e:71:af:c1 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'a62d7bac-49cb-4c15-9480-f0966c234d04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-231df9f5-d6a0-4997-b8de-325b6b48c797', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e7848670-66d3-47c2-aa04-0080edfddbef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9122ba4b-fdad-46f0-aa94-ea9942950307, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=943f8d68-bb52-4293-98db-563f4e14df6e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.259 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 943f8d68-bb52-4293-98db-563f4e14df6e in datapath 231df9f5-d6a0-4997-b8de-325b6b48c797 bound to our chassis
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.260 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 231df9f5-d6a0-4997-b8de-325b6b48c797
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.268 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[16e60669-913e-4c69-a51f-1d76e970b04b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.269 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap231df9f5-d1 in ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.270 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap231df9f5-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.270 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ecc6c1-9d25-44a9-8a6e-f082f6ec9e25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.270 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[721b92ce-ffdf-4900-b6f4-ce761b9d445e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 systemd-udevd[210864]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.286 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd212dc-b0ae-4c29-83c2-863841c27189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.292 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 ovn_controller[94974]: 2026-01-12T13:46:02Z|00064|binding|INFO|Setting lport 943f8d68-bb52-4293-98db-563f4e14df6e ovn-installed in OVS
Jan 12 13:46:02 compute-0 ovn_controller[94974]: 2026-01-12T13:46:02Z|00065|binding|INFO|Setting lport 943f8d68-bb52-4293-98db-563f4e14df6e up in Southbound
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.294 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 NetworkManager[55211]: <info>  [1768225562.2980] device (tap943f8d68-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:46:02 compute-0 NetworkManager[55211]: <info>  [1768225562.2987] device (tap943f8d68-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.297 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b67a2a-47c0-42a1-926d-465274a5f1d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.314 181991 DEBUG nova.virt.libvirt.driver [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.314 181991 DEBUG nova.virt.libvirt.driver [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.315 181991 DEBUG nova.virt.libvirt.driver [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:db:66:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.315 181991 DEBUG nova.virt.libvirt.driver [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:71:af:c1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.324 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[1015aee2-2c80-4bdc-a688-c84c5dca8873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 NetworkManager[55211]: <info>  [1768225562.3304] manager: (tap231df9f5-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.330 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[4fea6625-129a-471e-9935-7e62b214f9a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.339 181991 DEBUG nova.virt.libvirt.guest [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:46:02</nova:creationTime>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:46:02 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     <nova:port uuid="943f8d68-bb52-4293-98db-563f4e14df6e">
Jan 12 13:46:02 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 12 13:46:02 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:02 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:46:02 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:46:02 compute-0 nova_compute[181978]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 12 13:46:02 compute-0 podman[210856]: 2026-01-12 13:46:02.344847953 +0000 UTC m=+0.077172772 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.357 181991 DEBUG oslo_concurrency.lockutils [None req-6d039084-406d-4d49-bec0-2bdd47a5eb5b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-a62d7bac-49cb-4c15-9480-f0966c234d04-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.358 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ea58ab-802a-4684-ba1e-95f5cc6779bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.360 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[de41fbd0-3ed0-4440-8628-a5def5648979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 NetworkManager[55211]: <info>  [1768225562.3738] device (tap231df9f5-d0): carrier: link connected
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.377 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[ae6c144f-04bb-478e-b81c-6260293c179d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.388 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[398a1ca3-70fc-4512-9157-5ae23bbfa12e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap231df9f5-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:3f:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 262840, 'reachable_time': 20647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210896, 'error': None, 'target': 'ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.399 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[77d76e86-b737-44de-b7aa-a06220205f9d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:3fc5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 262840, 'tstamp': 262840}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210897, 'error': None, 'target': 'ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.409 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[14b9becf-a56a-4792-922e-e80c27f97b6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap231df9f5-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:3f:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 262840, 'reachable_time': 20647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210898, 'error': None, 'target': 'ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.428 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ac009caa-3e63-4566-9d92-667a576e682e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.465 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[00c1fbd7-4ce0-4b74-b464-df8b5da51eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.466 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap231df9f5-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.466 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.467 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap231df9f5-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.468 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 NetworkManager[55211]: <info>  [1768225562.4686] manager: (tap231df9f5-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 12 13:46:02 compute-0 kernel: tap231df9f5-d0: entered promiscuous mode
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.470 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.472 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap231df9f5-d0, col_values=(('external_ids', {'iface-id': 'd4a3911d-47c9-438b-9bf8-28213dc37774'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:02 compute-0 ovn_controller[94974]: 2026-01-12T13:46:02Z|00066|binding|INFO|Releasing lport d4a3911d-47c9-438b-9bf8-28213dc37774 from this chassis (sb_readonly=0)
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.473 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.475 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/231df9f5-d6a0-4997-b8de-325b6b48c797.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/231df9f5-d6a0-4997-b8de-325b6b48c797.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.475 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e3d6bd-be8b-4504-82a3-6027d3f09f8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.476 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-231df9f5-d6a0-4997-b8de-325b6b48c797
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/231df9f5-d6a0-4997-b8de-325b6b48c797.pid.haproxy
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID 231df9f5-d6a0-4997-b8de-325b6b48c797
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:46:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:02.477 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797', 'env', 'PROCESS_TAG=haproxy-231df9f5-d6a0-4997-b8de-325b6b48c797', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/231df9f5-d6a0-4997-b8de-325b6b48c797.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:46:02 compute-0 nova_compute[181978]: 2026-01-12 13:46:02.485 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:02 compute-0 podman[210926]: 2026-01-12 13:46:02.751147092 +0000 UTC m=+0.031145394 container create 76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:46:02 compute-0 systemd[1]: Started libpod-conmon-76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004.scope.
Jan 12 13:46:02 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:46:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c1b97287a1685bc9f596496ba4fcbd074f51214a13c4bfa59b887b681df3cb8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:46:02 compute-0 podman[210926]: 2026-01-12 13:46:02.807110349 +0000 UTC m=+0.087108671 container init 76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:02 compute-0 podman[210926]: 2026-01-12 13:46:02.811678844 +0000 UTC m=+0.091677145 container start 76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 12 13:46:02 compute-0 podman[210926]: 2026-01-12 13:46:02.737471986 +0000 UTC m=+0.017470307 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:46:02 compute-0 neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797[210938]: [NOTICE]   (210942) : New worker (210944) forked
Jan 12 13:46:02 compute-0 neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797[210938]: [NOTICE]   (210942) : Loading success.
Jan 12 13:46:03 compute-0 nova_compute[181978]: 2026-01-12 13:46:03.331 181991 DEBUG nova.compute.manager [req-451c1606-e838-4e8c-a58f-70dee96c570b req-0925278c-4f1f-4010-8b72-d0a9e08abb0d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:03 compute-0 nova_compute[181978]: 2026-01-12 13:46:03.332 181991 DEBUG oslo_concurrency.lockutils [req-451c1606-e838-4e8c-a58f-70dee96c570b req-0925278c-4f1f-4010-8b72-d0a9e08abb0d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:03 compute-0 nova_compute[181978]: 2026-01-12 13:46:03.332 181991 DEBUG oslo_concurrency.lockutils [req-451c1606-e838-4e8c-a58f-70dee96c570b req-0925278c-4f1f-4010-8b72-d0a9e08abb0d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:03 compute-0 nova_compute[181978]: 2026-01-12 13:46:03.332 181991 DEBUG oslo_concurrency.lockutils [req-451c1606-e838-4e8c-a58f-70dee96c570b req-0925278c-4f1f-4010-8b72-d0a9e08abb0d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:03 compute-0 nova_compute[181978]: 2026-01-12 13:46:03.332 181991 DEBUG nova.compute.manager [req-451c1606-e838-4e8c-a58f-70dee96c570b req-0925278c-4f1f-4010-8b72-d0a9e08abb0d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] No waiting events found dispatching network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:46:03 compute-0 nova_compute[181978]: 2026-01-12 13:46:03.332 181991 WARNING nova.compute.manager [req-451c1606-e838-4e8c-a58f-70dee96c570b req-0925278c-4f1f-4010-8b72-d0a9e08abb0d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received unexpected event network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e for instance with vm_state active and task_state None.
Jan 12 13:46:03 compute-0 ovn_controller[94974]: 2026-01-12T13:46:03Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:af:c1 10.100.0.20
Jan 12 13:46:03 compute-0 ovn_controller[94974]: 2026-01-12T13:46:03Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:af:c1 10.100.0.20
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.453 181991 DEBUG oslo_concurrency.lockutils [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "interface-a62d7bac-49cb-4c15-9480-f0966c234d04-943f8d68-bb52-4293-98db-563f4e14df6e" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.453 181991 DEBUG oslo_concurrency.lockutils [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-a62d7bac-49cb-4c15-9480-f0966c234d04-943f8d68-bb52-4293-98db-563f4e14df6e" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.466 181991 DEBUG nova.objects.instance [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'flavor' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.480 181991 DEBUG nova.virt.libvirt.vif [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:45:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:45:35Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.480 181991 DEBUG nova.network.os_vif_util [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.480 181991 DEBUG nova.network.os_vif_util [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.482 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.483 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.485 181991 DEBUG nova.virt.libvirt.driver [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Attempting to detach device tap943f8d68-bb from instance a62d7bac-49cb-4c15-9480-f0966c234d04 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.485 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] detach device xml: <interface type="ethernet">
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <mac address="fa:16:3e:71:af:c1"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <model type="virtio"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <mtu size="1442"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <target dev="tap943f8d68-bb"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]: </interface>
Jan 12 13:46:04 compute-0 nova_compute[181978]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.488 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.490 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <name>instance-00000003</name>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <uuid>a62d7bac-49cb-4c15-9480-f0966c234d04</uuid>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:46:02</nova:creationTime>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:port uuid="943f8d68-bb52-4293-98db-563f4e14df6e">
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:46:04 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <system>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='serial'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='uuid'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </system>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <os>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </os>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <features>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </features>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk' index='2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config' index='1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:db:66:08'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target dev='tap8163da9a-c7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:71:af:c1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target dev='tap943f8d68-bb'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='net1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       </target>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </console>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <video>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </video>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c449,c908</label>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c449,c908</imagelabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]: </domain>
Jan 12 13:46:04 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.490 181991 INFO nova.virt.libvirt.driver [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully detached device tap943f8d68-bb from instance a62d7bac-49cb-4c15-9480-f0966c234d04 from the persistent domain config.
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.490 181991 DEBUG nova.virt.libvirt.driver [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] (1/8): Attempting to detach device tap943f8d68-bb with device alias net1 from instance a62d7bac-49cb-4c15-9480-f0966c234d04 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.491 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] detach device xml: <interface type="ethernet">
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <mac address="fa:16:3e:71:af:c1"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <model type="virtio"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <mtu size="1442"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <target dev="tap943f8d68-bb"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]: </interface>
Jan 12 13:46:04 compute-0 nova_compute[181978]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 12 13:46:04 compute-0 kernel: tap943f8d68-bb (unregistering): left promiscuous mode
Jan 12 13:46:04 compute-0 NetworkManager[55211]: <info>  [1768225564.5310] device (tap943f8d68-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.535 181991 DEBUG nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Received event <DeviceRemovedEvent: 1768225564.5352948, a62d7bac-49cb-4c15-9480-f0966c234d04 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.537 181991 DEBUG nova.virt.libvirt.driver [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Start waiting for the detach event from libvirt for device tap943f8d68-bb with device alias net1 for instance a62d7bac-49cb-4c15-9480-f0966c234d04 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.537 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.538 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:04 compute-0 ovn_controller[94974]: 2026-01-12T13:46:04Z|00067|binding|INFO|Releasing lport 943f8d68-bb52-4293-98db-563f4e14df6e from this chassis (sb_readonly=0)
Jan 12 13:46:04 compute-0 ovn_controller[94974]: 2026-01-12T13:46:04Z|00068|binding|INFO|Setting lport 943f8d68-bb52-4293-98db-563f4e14df6e down in Southbound
Jan 12 13:46:04 compute-0 ovn_controller[94974]: 2026-01-12T13:46:04Z|00069|binding|INFO|Removing iface tap943f8d68-bb ovn-installed in OVS
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.541 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.542 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <name>instance-00000003</name>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <uuid>a62d7bac-49cb-4c15-9480-f0966c234d04</uuid>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:46:02</nova:creationTime>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:port uuid="943f8d68-bb52-4293-98db-563f4e14df6e">
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:46:04 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <system>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='serial'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='uuid'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </system>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <os>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </os>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <features>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </features>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk' index='2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config' index='1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:db:66:08'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target dev='tap8163da9a-c7'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       </target>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </console>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <video>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </video>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c449,c908</label>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c449,c908</imagelabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:04 compute-0 nova_compute[181978]: </domain>
Jan 12 13:46:04 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.542 181991 INFO nova.virt.libvirt.driver [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully detached device tap943f8d68-bb from instance a62d7bac-49cb-4c15-9480-f0966c234d04 from the live domain config.
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.543 181991 DEBUG nova.virt.libvirt.vif [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:45:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:45:35Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.543 181991 DEBUG nova.network.os_vif_util [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.543 181991 DEBUG nova.network.os_vif_util [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.544 181991 DEBUG os_vif [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.545 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.545 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap943f8d68-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.548 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.549 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.552 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:af:c1 10.100.0.20'], port_security=['fa:16:3e:71:af:c1 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'a62d7bac-49cb-4c15-9480-f0966c234d04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-231df9f5-d6a0-4997-b8de-325b6b48c797', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e7848670-66d3-47c2-aa04-0080edfddbef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9122ba4b-fdad-46f0-aa94-ea9942950307, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=943f8d68-bb52-4293-98db-563f4e14df6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.553 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.554 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 943f8d68-bb52-4293-98db-563f4e14df6e in datapath 231df9f5-d6a0-4997-b8de-325b6b48c797 unbound from our chassis
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.554 181991 INFO os_vif [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb')
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.555 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 231df9f5-d6a0-4997-b8de-325b6b48c797, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.555 181991 DEBUG nova.virt.libvirt.guest [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:46:04</nova:creationTime>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:46:04 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:46:04 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:04 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:46:04 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:46:04 compute-0 nova_compute[181978]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.556 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e8266323-38c1-43d5-b8a9-b7fb4dcc4a65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.556 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797 namespace which is not needed anymore
Jan 12 13:46:04 compute-0 neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797[210938]: [NOTICE]   (210942) : haproxy version is 2.8.14-c23fe91
Jan 12 13:46:04 compute-0 neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797[210938]: [NOTICE]   (210942) : path to executable is /usr/sbin/haproxy
Jan 12 13:46:04 compute-0 neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797[210938]: [WARNING]  (210942) : Exiting Master process...
Jan 12 13:46:04 compute-0 neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797[210938]: [ALERT]    (210942) : Current worker (210944) exited with code 143 (Terminated)
Jan 12 13:46:04 compute-0 neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797[210938]: [WARNING]  (210942) : All workers exited. Exiting... (0)
Jan 12 13:46:04 compute-0 systemd[1]: libpod-76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004.scope: Deactivated successfully.
Jan 12 13:46:04 compute-0 conmon[210938]: conmon 76bda36050b5c0dfb13c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004.scope/container/memory.events
Jan 12 13:46:04 compute-0 podman[210967]: 2026-01-12 13:46:04.64895144 +0000 UTC m=+0.031403899 container died 76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004-userdata-shm.mount: Deactivated successfully.
Jan 12 13:46:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-5c1b97287a1685bc9f596496ba4fcbd074f51214a13c4bfa59b887b681df3cb8-merged.mount: Deactivated successfully.
Jan 12 13:46:04 compute-0 podman[210967]: 2026-01-12 13:46:04.667036933 +0000 UTC m=+0.049489394 container cleanup 76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:04 compute-0 systemd[1]: libpod-conmon-76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004.scope: Deactivated successfully.
Jan 12 13:46:04 compute-0 podman[210990]: 2026-01-12 13:46:04.704919656 +0000 UTC m=+0.023364234 container remove 76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.708 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e2921915-9710-4c12-beff-5fbb1a07cbe2]: (4, ('Mon Jan 12 01:46:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797 (76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004)\n76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004\nMon Jan 12 01:46:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797 (76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004)\n76bda36050b5c0dfb13ccee3b888cd8e7e180632606df7f27c4ecb84bdcd3004\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.711 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f507dc-cf48-4caf-b9a1-0ac00275b716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.712 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap231df9f5-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:04 compute-0 kernel: tap231df9f5-d0: left promiscuous mode
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.716 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:04 compute-0 nova_compute[181978]: 2026-01-12 13:46:04.726 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.729 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[dea62ecf-4d72-46de-bee4-b6efd8e69286]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.739 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbc9786-7854-472e-ac5a-e43e239a2498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.739 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[92424f86-f4c3-4578-a52b-2a632ad40ce0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.751 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e4015625-c55e-46bd-9542-f3fc75da32ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 262834, 'reachable_time': 30887, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211002, 'error': None, 'target': 'ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d231df9f5\x2dd6a0\x2d4997\x2db8de\x2d325b6b48c797.mount: Deactivated successfully.
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.753 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-231df9f5-d6a0-4997-b8de-325b6b48c797 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:46:04 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:04.753 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[838e89af-45bd-40c7-a36d-0db2aafd6d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.420 181991 DEBUG nova.compute.manager [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.420 181991 DEBUG oslo_concurrency.lockutils [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.420 181991 DEBUG oslo_concurrency.lockutils [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.420 181991 DEBUG oslo_concurrency.lockutils [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.421 181991 DEBUG nova.compute.manager [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] No waiting events found dispatching network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.421 181991 WARNING nova.compute.manager [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received unexpected event network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e for instance with vm_state active and task_state None.
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.421 181991 DEBUG nova.compute.manager [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-unplugged-943f8d68-bb52-4293-98db-563f4e14df6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.421 181991 DEBUG oslo_concurrency.lockutils [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.421 181991 DEBUG oslo_concurrency.lockutils [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.421 181991 DEBUG oslo_concurrency.lockutils [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.421 181991 DEBUG nova.compute.manager [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] No waiting events found dispatching network-vif-unplugged-943f8d68-bb52-4293-98db-563f4e14df6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:46:05 compute-0 nova_compute[181978]: 2026-01-12 13:46:05.422 181991 WARNING nova.compute.manager [req-d3003e8d-e6c0-46f6-a631-1ab4d14b905c req-389ee030-8885-4fa9-bb52-c1dc91cdeff5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received unexpected event network-vif-unplugged-943f8d68-bb52-4293-98db-563f4e14df6e for instance with vm_state active and task_state None.
Jan 12 13:46:05 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:05.855 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.216 181991 DEBUG nova.network.neutron [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updated VIF entry in instance network info cache for port 943f8d68-bb52-4293-98db-563f4e14df6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.216 181991 DEBUG nova.network.neutron [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.228 181991 DEBUG oslo_concurrency.lockutils [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.229 181991 DEBUG oslo_concurrency.lockutils [req-b28e2ff0-a789-4271-89b6-bcea39192c8e req-5c3eb850-6347-44d6-8b86-ee4b72c57e5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.230 181991 DEBUG oslo_concurrency.lockutils [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.230 181991 DEBUG nova.network.neutron [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.281 181991 DEBUG nova.compute.manager [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-deleted-943f8d68-bb52-4293-98db-563f4e14df6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.281 181991 INFO nova.compute.manager [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Neutron deleted interface 943f8d68-bb52-4293-98db-563f4e14df6e; detaching it from the instance and deleting it from the info cache
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.282 181991 DEBUG nova.network.neutron [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.299 181991 DEBUG nova.objects.instance [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lazy-loading 'system_metadata' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.336 181991 DEBUG nova.objects.instance [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lazy-loading 'flavor' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.367 181991 DEBUG nova.virt.libvirt.vif [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:45:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:45:35Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.368 181991 DEBUG nova.network.os_vif_util [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converting VIF {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.368 181991 DEBUG nova.network.os_vif_util [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.370 181991 DEBUG nova.virt.libvirt.guest [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.372 181991 DEBUG nova.virt.libvirt.guest [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <name>instance-00000003</name>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <uuid>a62d7bac-49cb-4c15-9480-f0966c234d04</uuid>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:46:04</nova:creationTime>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:46:06 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <system>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='serial'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='uuid'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </system>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <os>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </os>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <features>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </features>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk' index='2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config' index='1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:db:66:08'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target dev='tap8163da9a-c7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       </target>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </console>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <video>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </video>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c449,c908</label>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c449,c908</imagelabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]: </domain>
Jan 12 13:46:06 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.372 181991 DEBUG nova.virt.libvirt.guest [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.374 181991 DEBUG nova.virt.libvirt.guest [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:71:af:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap943f8d68-bb"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <name>instance-00000003</name>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <uuid>a62d7bac-49cb-4c15-9480-f0966c234d04</uuid>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:46:04</nova:creationTime>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:46:06 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <system>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='serial'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='uuid'>a62d7bac-49cb-4c15-9480-f0966c234d04</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </system>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <os>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </os>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <features>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </features>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk' index='2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/disk.config' index='1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:db:66:08'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target dev='tap8163da9a-c7'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       </target>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04/console.log' append='off'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </console>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </input>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <video>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </video>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c449,c908</label>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c449,c908</imagelabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:46:06 compute-0 nova_compute[181978]: </domain>
Jan 12 13:46:06 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.375 181991 WARNING nova.virt.libvirt.driver [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Detaching interface fa:16:3e:71:af:c1 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap943f8d68-bb' not found.
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.375 181991 DEBUG nova.virt.libvirt.vif [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:45:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:45:35Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.375 181991 DEBUG nova.network.os_vif_util [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converting VIF {"id": "943f8d68-bb52-4293-98db-563f4e14df6e", "address": "fa:16:3e:71:af:c1", "network": {"id": "231df9f5-d6a0-4997-b8de-325b6b48c797", "bridge": "br-int", "label": "tempest-network-smoke--1546666320", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap943f8d68-bb", "ovs_interfaceid": "943f8d68-bb52-4293-98db-563f4e14df6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.376 181991 DEBUG nova.network.os_vif_util [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.376 181991 DEBUG os_vif [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.377 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.377 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap943f8d68-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.377 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.379 181991 INFO os_vif [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:af:c1,bridge_name='br-int',has_traffic_filtering=True,id=943f8d68-bb52-4293-98db-563f4e14df6e,network=Network(231df9f5-d6a0-4997-b8de-325b6b48c797),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap943f8d68-bb')
Jan 12 13:46:06 compute-0 nova_compute[181978]: 2026-01-12 13:46:06.379 181991 DEBUG nova.virt.libvirt.guest [req-9db7efd9-9839-4150-9ec6-a5ea21df400f req-6439926a-25f9-4bf9-97b7-edccb0eee319 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1699210219</nova:name>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:46:06</nova:creationTime>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     <nova:port uuid="8163da9a-c7bd-4211-9987-5fa00984c726">
Jan 12 13:46:06 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:46:06 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:46:06 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:46:06 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:46:06 compute-0 nova_compute[181978]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.166 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.492 181991 DEBUG nova.compute.manager [req-e84e6c44-3f37-48ef-86e9-a5e84fbf93c4 req-6f103d69-26fb-4a2d-a9fe-67df92bfd37e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.492 181991 DEBUG oslo_concurrency.lockutils [req-e84e6c44-3f37-48ef-86e9-a5e84fbf93c4 req-6f103d69-26fb-4a2d-a9fe-67df92bfd37e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.492 181991 DEBUG oslo_concurrency.lockutils [req-e84e6c44-3f37-48ef-86e9-a5e84fbf93c4 req-6f103d69-26fb-4a2d-a9fe-67df92bfd37e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.492 181991 DEBUG oslo_concurrency.lockutils [req-e84e6c44-3f37-48ef-86e9-a5e84fbf93c4 req-6f103d69-26fb-4a2d-a9fe-67df92bfd37e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.493 181991 DEBUG nova.compute.manager [req-e84e6c44-3f37-48ef-86e9-a5e84fbf93c4 req-6f103d69-26fb-4a2d-a9fe-67df92bfd37e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] No waiting events found dispatching network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.493 181991 WARNING nova.compute.manager [req-e84e6c44-3f37-48ef-86e9-a5e84fbf93c4 req-6f103d69-26fb-4a2d-a9fe-67df92bfd37e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received unexpected event network-vif-plugged-943f8d68-bb52-4293-98db-563f4e14df6e for instance with vm_state active and task_state None.
Jan 12 13:46:07 compute-0 ovn_controller[94974]: 2026-01-12T13:46:07Z|00070|binding|INFO|Releasing lport e064259b-0a90-4a58-b36e-6946c46982b0 from this chassis (sb_readonly=0)
Jan 12 13:46:07 compute-0 nova_compute[181978]: 2026-01-12 13:46:07.537 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.412 181991 INFO nova.network.neutron [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Port 943f8d68-bb52-4293-98db-563f4e14df6e from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.412 181991 DEBUG nova.network.neutron [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.425 181991 DEBUG oslo_concurrency.lockutils [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.438 181991 DEBUG oslo_concurrency.lockutils [None req-4854e232-dab4-4e7d-9090-af8a6f4027c3 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-a62d7bac-49cb-4c15-9480-f0966c234d04-943f8d68-bb52-4293-98db-563f4e14df6e" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.984s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.923 181991 DEBUG nova.compute.manager [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-changed-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.924 181991 DEBUG nova.compute.manager [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing instance network info cache due to event network-changed-8163da9a-c7bd-4211-9987-5fa00984c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.924 181991 DEBUG oslo_concurrency.lockutils [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.924 181991 DEBUG oslo_concurrency.lockutils [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.924 181991 DEBUG nova.network.neutron [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Refreshing network info cache for port 8163da9a-c7bd-4211-9987-5fa00984c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.977 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.977 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.977 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.977 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.978 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.978 181991 INFO nova.compute.manager [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Terminating instance
Jan 12 13:46:08 compute-0 nova_compute[181978]: 2026-01-12 13:46:08.979 181991 DEBUG nova.compute.manager [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:46:09 compute-0 kernel: tap8163da9a-c7 (unregistering): left promiscuous mode
Jan 12 13:46:09 compute-0 NetworkManager[55211]: <info>  [1768225569.0038] device (tap8163da9a-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.012 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 ovn_controller[94974]: 2026-01-12T13:46:09Z|00071|binding|INFO|Releasing lport 8163da9a-c7bd-4211-9987-5fa00984c726 from this chassis (sb_readonly=0)
Jan 12 13:46:09 compute-0 ovn_controller[94974]: 2026-01-12T13:46:09Z|00072|binding|INFO|Setting lport 8163da9a-c7bd-4211-9987-5fa00984c726 down in Southbound
Jan 12 13:46:09 compute-0 ovn_controller[94974]: 2026-01-12T13:46:09Z|00073|binding|INFO|Removing iface tap8163da9a-c7 ovn-installed in OVS
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.018 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:66:08 10.100.0.8'], port_security=['fa:16:3e:db:66:08 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a62d7bac-49cb-4c15-9480-f0966c234d04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6047b481-74d8-4106-8233-64be950f9819', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5578ebe3-1d0d-4674-ad1c-b614da7395c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a076cb36-abd8-4ce2-9cbb-c6cc0abb14bd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=8163da9a-c7bd-4211-9987-5fa00984c726) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.019 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 8163da9a-c7bd-4211-9987-5fa00984c726 in datapath 6047b481-74d8-4106-8233-64be950f9819 unbound from our chassis
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.020 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6047b481-74d8-4106-8233-64be950f9819, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.020 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.021 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3dce5e07-24df-4767-b749-c92876b19b28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.021 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6047b481-74d8-4106-8233-64be950f9819 namespace which is not needed anymore
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.035 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 12 13:46:09 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 11.243s CPU time.
Jan 12 13:46:09 compute-0 systemd-machined[153581]: Machine qemu-3-instance-00000003 terminated.
Jan 12 13:46:09 compute-0 neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819[210720]: [NOTICE]   (210724) : haproxy version is 2.8.14-c23fe91
Jan 12 13:46:09 compute-0 neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819[210720]: [NOTICE]   (210724) : path to executable is /usr/sbin/haproxy
Jan 12 13:46:09 compute-0 neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819[210720]: [WARNING]  (210724) : Exiting Master process...
Jan 12 13:46:09 compute-0 neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819[210720]: [ALERT]    (210724) : Current worker (210726) exited with code 143 (Terminated)
Jan 12 13:46:09 compute-0 neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819[210720]: [WARNING]  (210724) : All workers exited. Exiting... (0)
Jan 12 13:46:09 compute-0 systemd[1]: libpod-3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84.scope: Deactivated successfully.
Jan 12 13:46:09 compute-0 podman[211024]: 2026-01-12 13:46:09.116921132 +0000 UTC m=+0.034819598 container died 3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84-userdata-shm.mount: Deactivated successfully.
Jan 12 13:46:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-17b8c83e58caebf8c166e5c9cc224e9d9749b764ca37a9d717153affb3ff19a1-merged.mount: Deactivated successfully.
Jan 12 13:46:09 compute-0 podman[211024]: 2026-01-12 13:46:09.140208771 +0000 UTC m=+0.058107246 container cleanup 3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 12 13:46:09 compute-0 systemd[1]: libpod-conmon-3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84.scope: Deactivated successfully.
Jan 12 13:46:09 compute-0 podman[211047]: 2026-01-12 13:46:09.178636348 +0000 UTC m=+0.022794992 container remove 3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.185 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e8bc2a0c-01be-43fc-83e0-f135046a70ed]: (4, ('Mon Jan 12 01:46:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819 (3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84)\n3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84\nMon Jan 12 01:46:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6047b481-74d8-4106-8233-64be950f9819 (3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84)\n3cef86177f3bdcb93bd5d12379300e6244f009b8e15930754ee7119925404e84\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.186 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab6e530-f101-4b13-95c0-621ab90970f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.187 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6047b481-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.188 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 kernel: tap6047b481-70: left promiscuous mode
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.204 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.205 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.206 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[42600d2d-e547-4f78-b0a6-ec2efa6265c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.214 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[896610af-9c4e-4aca-8f79-9dd812b2d6ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.215 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f9c5fb-52f2-40e5-9287-6018af88ef8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.224 181991 INFO nova.virt.libvirt.driver [-] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Instance destroyed successfully.
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.225 181991 DEBUG nova.objects.instance [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid a62d7bac-49cb-4c15-9480-f0966c234d04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.227 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf4bf0e-90ef-4b22-a5aa-de25678f8f08]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 260052, 'reachable_time': 27725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211073, 'error': None, 'target': 'ovnmeta-6047b481-74d8-4106-8233-64be950f9819', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d6047b481\x2d74d8\x2d4106\x2d8233\x2d64be950f9819.mount: Deactivated successfully.
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.229 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6047b481-74d8-4106-8233-64be950f9819 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:46:09 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:09.229 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[ef828153-2883-45cc-85d3-a27d90160693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.237 181991 DEBUG nova.virt.libvirt.vif [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1699210219',display_name='tempest-TestNetworkBasicOps-server-1699210219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1699210219',id=3,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa34emIksTW5XRTN2HjrF17Nrm01ikW/EcCvVD8j9em2yUPyxX5X112js/FIFFCkUdpCOqO4Zf2B4kJ5ygkQLAtD/X6vZ4bcCkTD4/RWf4dC8RvnasSQpYvRSqjYQ4YEQ==',key_name='tempest-TestNetworkBasicOps-1346800269',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:45:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-b9761sje',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:45:35Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=a62d7bac-49cb-4c15-9480-f0966c234d04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.237 181991 DEBUG nova.network.os_vif_util [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.237 181991 DEBUG nova.network.os_vif_util [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:66:08,bridge_name='br-int',has_traffic_filtering=True,id=8163da9a-c7bd-4211-9987-5fa00984c726,network=Network(6047b481-74d8-4106-8233-64be950f9819),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8163da9a-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.238 181991 DEBUG os_vif [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:66:08,bridge_name='br-int',has_traffic_filtering=True,id=8163da9a-c7bd-4211-9987-5fa00984c726,network=Network(6047b481-74d8-4106-8233-64be950f9819),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8163da9a-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.238 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.239 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8163da9a-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.240 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.241 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.242 181991 INFO os_vif [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:66:08,bridge_name='br-int',has_traffic_filtering=True,id=8163da9a-c7bd-4211-9987-5fa00984c726,network=Network(6047b481-74d8-4106-8233-64be950f9819),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8163da9a-c7')
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.243 181991 INFO nova.virt.libvirt.driver [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Deleting instance files /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04_del
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.243 181991 INFO nova.virt.libvirt.driver [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Deletion of /var/lib/nova/instances/a62d7bac-49cb-4c15-9480-f0966c234d04_del complete
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.279 181991 INFO nova.compute.manager [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.279 181991 DEBUG oslo.service.loopingcall [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.279 181991 DEBUG nova.compute.manager [-] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.279 181991 DEBUG nova.network.neutron [-] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.561 181991 DEBUG nova.compute.manager [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-unplugged-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.561 181991 DEBUG oslo_concurrency.lockutils [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.562 181991 DEBUG oslo_concurrency.lockutils [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.562 181991 DEBUG oslo_concurrency.lockutils [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.562 181991 DEBUG nova.compute.manager [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] No waiting events found dispatching network-vif-unplugged-8163da9a-c7bd-4211-9987-5fa00984c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.562 181991 DEBUG nova.compute.manager [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-unplugged-8163da9a-c7bd-4211-9987-5fa00984c726 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.562 181991 DEBUG nova.compute.manager [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.562 181991 DEBUG oslo_concurrency.lockutils [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.563 181991 DEBUG oslo_concurrency.lockutils [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.563 181991 DEBUG oslo_concurrency.lockutils [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.563 181991 DEBUG nova.compute.manager [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] No waiting events found dispatching network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.563 181991 WARNING nova.compute.manager [req-49bec366-ee1a-45da-829a-60cb2a5b98bc req-aad9bc55-e1a9-407f-9b4c-a008dc669c54 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received unexpected event network-vif-plugged-8163da9a-c7bd-4211-9987-5fa00984c726 for instance with vm_state active and task_state deleting.
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.649 181991 DEBUG nova.network.neutron [-] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.657 181991 INFO nova.compute.manager [-] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Took 0.38 seconds to deallocate network for instance.
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.686 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.686 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.726 181991 DEBUG nova.compute.provider_tree [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.736 181991 DEBUG nova.scheduler.client.report [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.753 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.769 181991 DEBUG nova.network.neutron [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updated VIF entry in instance network info cache for port 8163da9a-c7bd-4211-9987-5fa00984c726. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.769 181991 DEBUG nova.network.neutron [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Updating instance_info_cache with network_info: [{"id": "8163da9a-c7bd-4211-9987-5fa00984c726", "address": "fa:16:3e:db:66:08", "network": {"id": "6047b481-74d8-4106-8233-64be950f9819", "bridge": "br-int", "label": "tempest-network-smoke--1051495819", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8163da9a-c7", "ovs_interfaceid": "8163da9a-c7bd-4211-9987-5fa00984c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.773 181991 INFO nova.scheduler.client.report [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance a62d7bac-49cb-4c15-9480-f0966c234d04
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.787 181991 DEBUG oslo_concurrency.lockutils [req-7b780061-8ae6-440d-8602-4fa2e4866f43 req-38501c31-f70f-45c6-9edd-87d24c9bbb3a 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-a62d7bac-49cb-4c15-9480-f0966c234d04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:09 compute-0 nova_compute[181978]: 2026-01-12 13:46:09.825 181991 DEBUG oslo_concurrency.lockutils [None req-8b779664-6d04-4925-b6c3-62a5a21b89a9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "a62d7bac-49cb-4c15-9480-f0966c234d04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:10 compute-0 nova_compute[181978]: 2026-01-12 13:46:10.981 181991 DEBUG nova.compute.manager [req-cb59f6a7-aadd-464b-ab17-e7f09babec63 req-ba49a056-34ce-4817-8e03-b8461617331e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Received event network-vif-deleted-8163da9a-c7bd-4211-9987-5fa00984c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:10 compute-0 nova_compute[181978]: 2026-01-12 13:46:10.981 181991 INFO nova.compute.manager [req-cb59f6a7-aadd-464b-ab17-e7f09babec63 req-ba49a056-34ce-4817-8e03-b8461617331e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Neutron deleted interface 8163da9a-c7bd-4211-9987-5fa00984c726; detaching it from the instance and deleting it from the info cache
Jan 12 13:46:10 compute-0 nova_compute[181978]: 2026-01-12 13:46:10.981 181991 DEBUG nova.network.neutron [req-cb59f6a7-aadd-464b-ab17-e7f09babec63 req-ba49a056-34ce-4817-8e03-b8461617331e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 12 13:46:10 compute-0 nova_compute[181978]: 2026-01-12 13:46:10.983 181991 DEBUG nova.compute.manager [req-cb59f6a7-aadd-464b-ab17-e7f09babec63 req-ba49a056-34ce-4817-8e03-b8461617331e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Detach interface failed, port_id=8163da9a-c7bd-4211-9987-5fa00984c726, reason: Instance a62d7bac-49cb-4c15-9480-f0966c234d04 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 12 13:46:11 compute-0 podman[211078]: 2026-01-12 13:46:11.560828075 +0000 UTC m=+0.055035384 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 12 13:46:11 compute-0 podman[211101]: 2026-01-12 13:46:11.624339515 +0000 UTC m=+0.043308219 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 12 13:46:11 compute-0 podman[211102]: 2026-01-12 13:46:11.626616273 +0000 UTC m=+0.043565873 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 12 13:46:12 compute-0 nova_compute[181978]: 2026-01-12 13:46:12.168 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:13 compute-0 nova_compute[181978]: 2026-01-12 13:46:13.613 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:13 compute-0 nova_compute[181978]: 2026-01-12 13:46:13.686 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:14 compute-0 nova_compute[181978]: 2026-01-12 13:46:14.239 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:15 compute-0 nova_compute[181978]: 2026-01-12 13:46:15.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:15 compute-0 nova_compute[181978]: 2026-01-12 13:46:15.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:46:15 compute-0 nova_compute[181978]: 2026-01-12 13:46:15.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:46:15 compute-0 nova_compute[181978]: 2026-01-12 13:46:15.493 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:46:16 compute-0 nova_compute[181978]: 2026-01-12 13:46:16.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.170 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.510 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.510 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.510 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.511 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.705 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.706 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5748MB free_disk=73.3841781616211GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.706 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.706 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.757 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.757 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.773 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.782 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.795 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:46:17 compute-0 nova_compute[181978]: 2026-01-12 13:46:17.795 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:18 compute-0 nova_compute[181978]: 2026-01-12 13:46:18.791 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:18 compute-0 nova_compute[181978]: 2026-01-12 13:46:18.804 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:19 compute-0 nova_compute[181978]: 2026-01-12 13:46:19.240 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:19 compute-0 nova_compute[181978]: 2026-01-12 13:46:19.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:19 compute-0 nova_compute[181978]: 2026-01-12 13:46:19.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:19 compute-0 podman[211143]: 2026-01-12 13:46:19.543338497 +0000 UTC m=+0.037007981 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 12 13:46:20 compute-0 nova_compute[181978]: 2026-01-12 13:46:20.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:20 compute-0 nova_compute[181978]: 2026-01-12 13:46:20.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:46:20 compute-0 nova_compute[181978]: 2026-01-12 13:46:20.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:46:22 compute-0 nova_compute[181978]: 2026-01-12 13:46:22.171 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:24 compute-0 nova_compute[181978]: 2026-01-12 13:46:24.223 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225569.2233913, a62d7bac-49cb-4c15-9480-f0966c234d04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:46:24 compute-0 nova_compute[181978]: 2026-01-12 13:46:24.224 181991 INFO nova.compute.manager [-] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] VM Stopped (Lifecycle Event)
Jan 12 13:46:24 compute-0 nova_compute[181978]: 2026-01-12 13:46:24.241 181991 DEBUG nova.compute.manager [None req-5903f7aa-1143-4dbf-9f1f-d0e66a21560b - - - - - -] [instance: a62d7bac-49cb-4c15-9480-f0966c234d04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:46:24 compute-0 nova_compute[181978]: 2026-01-12 13:46:24.241 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.840 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.840 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.852 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.900 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.901 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.905 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.905 181991 INFO nova.compute.claims [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.975 181991 DEBUG nova.compute.provider_tree [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.985 181991 DEBUG nova.scheduler.client.report [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.997 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:26 compute-0 nova_compute[181978]: 2026-01-12 13:46:26.997 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.027 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.027 181991 DEBUG nova.network.neutron [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.038 181991 INFO nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.049 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.120 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.121 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.122 181991 INFO nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Creating image(s)
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.122 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.122 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.123 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.133 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.173 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.176 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.177 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.177 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.187 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.214 181991 DEBUG nova.policy [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.230 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.231 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.250 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.251 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.251 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.296 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.297 181991 DEBUG nova.virt.disk.api [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.297 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.341 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.342 181991 DEBUG nova.virt.disk.api [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.342 181991 DEBUG nova.objects.instance [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 38adfe16-dcdc-44a9-8c50-a051037d4bbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.357 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.357 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Ensure instance console log exists: /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.358 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.358 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.358 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:27 compute-0 nova_compute[181978]: 2026-01-12 13:46:27.744 181991 DEBUG nova.network.neutron [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Successfully created port: e7942da1-f887-4ac9-8a01-72673dab4bd2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.312 181991 DEBUG nova.network.neutron [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Successfully updated port: e7942da1-f887-4ac9-8a01-72673dab4bd2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.331 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.331 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.331 181991 DEBUG nova.network.neutron [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.396 181991 DEBUG nova.compute.manager [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-changed-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.397 181991 DEBUG nova.compute.manager [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Refreshing instance network info cache due to event network-changed-e7942da1-f887-4ac9-8a01-72673dab4bd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.397 181991 DEBUG oslo_concurrency.lockutils [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.438 181991 DEBUG nova.network.neutron [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.888 181991 DEBUG nova.network.neutron [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updating instance_info_cache with network_info: [{"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.907 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.907 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Instance network_info: |[{"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.907 181991 DEBUG oslo_concurrency.lockutils [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.907 181991 DEBUG nova.network.neutron [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Refreshing network info cache for port e7942da1-f887-4ac9-8a01-72673dab4bd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.909 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Start _get_guest_xml network_info=[{"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.913 181991 WARNING nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.918 181991 DEBUG nova.virt.libvirt.host [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.918 181991 DEBUG nova.virt.libvirt.host [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.923 181991 DEBUG nova.virt.libvirt.host [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.923 181991 DEBUG nova.virt.libvirt.host [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.923 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.923 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.924 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.924 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.924 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.924 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.925 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.925 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.925 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.925 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.925 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.926 181991 DEBUG nova.virt.hardware [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.928 181991 DEBUG nova.virt.libvirt.vif [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:46:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1047315553',display_name='tempest-TestNetworkBasicOps-server-1047315553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1047315553',id=4,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPYa+Z2CAObwJIdbzfDpkgludr9WfwFuPqwsPBX/YPYuQGJOjCo7MNviASGyDeZ3aFWFsbV/fgSZjrzS4BP5rPhMyLWt1gImaMrn3S/xdh0c/z6cvY2QHWueSS6KdTeIhw==',key_name='tempest-TestNetworkBasicOps-838323751',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-cxt50auh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:46:27Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=38adfe16-dcdc-44a9-8c50-a051037d4bbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.928 181991 DEBUG nova.network.os_vif_util [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.929 181991 DEBUG nova.network.os_vif_util [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:55:fe,bridge_name='br-int',has_traffic_filtering=True,id=e7942da1-f887-4ac9-8a01-72673dab4bd2,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7942da1-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.930 181991 DEBUG nova.objects.instance [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 38adfe16-dcdc-44a9-8c50-a051037d4bbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.941 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <uuid>38adfe16-dcdc-44a9-8c50-a051037d4bbe</uuid>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <name>instance-00000004</name>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-1047315553</nova:name>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:46:28</nova:creationTime>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         <nova:port uuid="e7942da1-f887-4ac9-8a01-72673dab4bd2">
Jan 12 13:46:28 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <system>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <entry name="serial">38adfe16-dcdc-44a9-8c50-a051037d4bbe</entry>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <entry name="uuid">38adfe16-dcdc-44a9-8c50-a051037d4bbe</entry>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </system>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <os>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   </os>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <features>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   </features>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.config"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:c0:55:fe"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <target dev="tape7942da1-f8"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/console.log" append="off"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <video>
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </video>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:46:28 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:46:28 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:46:28 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:46:28 compute-0 nova_compute[181978]: </domain>
Jan 12 13:46:28 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.942 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Preparing to wait for external event network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.942 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.943 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.943 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.943 181991 DEBUG nova.virt.libvirt.vif [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:46:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1047315553',display_name='tempest-TestNetworkBasicOps-server-1047315553',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1047315553',id=4,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPYa+Z2CAObwJIdbzfDpkgludr9WfwFuPqwsPBX/YPYuQGJOjCo7MNviASGyDeZ3aFWFsbV/fgSZjrzS4BP5rPhMyLWt1gImaMrn3S/xdh0c/z6cvY2QHWueSS6KdTeIhw==',key_name='tempest-TestNetworkBasicOps-838323751',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-cxt50auh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:46:27Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=38adfe16-dcdc-44a9-8c50-a051037d4bbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.943 181991 DEBUG nova.network.os_vif_util [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.944 181991 DEBUG nova.network.os_vif_util [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:55:fe,bridge_name='br-int',has_traffic_filtering=True,id=e7942da1-f887-4ac9-8a01-72673dab4bd2,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7942da1-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.944 181991 DEBUG os_vif [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:55:fe,bridge_name='br-int',has_traffic_filtering=True,id=e7942da1-f887-4ac9-8a01-72673dab4bd2,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7942da1-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.945 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.945 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.945 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.947 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.947 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7942da1-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.947 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7942da1-f8, col_values=(('external_ids', {'iface-id': 'e7942da1-f887-4ac9-8a01-72673dab4bd2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:55:fe', 'vm-uuid': '38adfe16-dcdc-44a9-8c50-a051037d4bbe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:28 compute-0 NetworkManager[55211]: <info>  [1768225588.9495] manager: (tape7942da1-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.951 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.953 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.954 181991 INFO os_vif [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:55:fe,bridge_name='br-int',has_traffic_filtering=True,id=e7942da1-f887-4ac9-8a01-72673dab4bd2,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7942da1-f8')
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.982 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.982 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.983 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:c0:55:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:46:28 compute-0 nova_compute[181978]: 2026-01-12 13:46:28.983 181991 INFO nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Using config drive
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.470 181991 INFO nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Creating config drive at /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.config
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.474 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp15soldiy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:29 compute-0 podman[211179]: 2026-01-12 13:46:29.564382801 +0000 UTC m=+0.060789497 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.589 181991 DEBUG oslo_concurrency.processutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp15soldiy" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:29 compute-0 kernel: tape7942da1-f8: entered promiscuous mode
Jan 12 13:46:29 compute-0 NetworkManager[55211]: <info>  [1768225589.6260] manager: (tape7942da1-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Jan 12 13:46:29 compute-0 ovn_controller[94974]: 2026-01-12T13:46:29Z|00074|binding|INFO|Claiming lport e7942da1-f887-4ac9-8a01-72673dab4bd2 for this chassis.
Jan 12 13:46:29 compute-0 ovn_controller[94974]: 2026-01-12T13:46:29Z|00075|binding|INFO|e7942da1-f887-4ac9-8a01-72673dab4bd2: Claiming fa:16:3e:c0:55:fe 10.100.0.5
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.627 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.630 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.635 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:55:fe 10.100.0.5'], port_security=['fa:16:3e:c0:55:fe 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b6c357cb-eeaf-4bbf-9955-c174b2707487', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd27fee5-c567-45b5-8a69-ab9b5802587e, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=e7942da1-f887-4ac9-8a01-72673dab4bd2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.636 104189 INFO neutron.agent.ovn.metadata.agent [-] Port e7942da1-f887-4ac9-8a01-72673dab4bd2 in datapath a2132c96-bfe0-4c64-a5b0-a3df61a88e5d bound to our chassis
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.637 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2132c96-bfe0-4c64-a5b0-a3df61a88e5d
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.646 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[42c9c321-ff5d-451f-9fe6-c4d7dcdd8585]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.646 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2132c96-b1 in ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:46:29 compute-0 systemd-udevd[211216]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.647 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2132c96-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.647 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[2592af1b-a114-440f-96a7-661dea4d3060]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.648 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[60b8ad21-533f-4464-a80c-d0f4d87549e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.657 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[85ab44f9-9488-4de0-b957-e2fa16d69504]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 NetworkManager[55211]: <info>  [1768225589.6609] device (tape7942da1-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:46:29 compute-0 NetworkManager[55211]: <info>  [1768225589.6615] device (tape7942da1-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:46:29 compute-0 systemd-machined[153581]: New machine qemu-4-instance-00000004.
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.685 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[611096c6-3cc4-4d52-9dbf-890e123f60a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.686 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:29 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Jan 12 13:46:29 compute-0 ovn_controller[94974]: 2026-01-12T13:46:29Z|00076|binding|INFO|Setting lport e7942da1-f887-4ac9-8a01-72673dab4bd2 ovn-installed in OVS
Jan 12 13:46:29 compute-0 ovn_controller[94974]: 2026-01-12T13:46:29Z|00077|binding|INFO|Setting lport e7942da1-f887-4ac9-8a01-72673dab4bd2 up in Southbound
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.693 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.707 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[6a747fa1-a9d5-425b-8cb2-8c6663c56035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 NetworkManager[55211]: <info>  [1768225589.7124] manager: (tapa2132c96-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.713 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d0304c95-67a1-4a8b-af8b-1d15982aeddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 systemd-udevd[211221]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.738 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[3dcd1624-1ddf-4156-8388-48117caa34b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.740 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[1a045ef3-816c-4fbc-8469-976fccbd0329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 NetworkManager[55211]: <info>  [1768225589.7565] device (tapa2132c96-b0): carrier: link connected
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.760 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3d419d-31d3-4929-b071-8fa5d93848f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.771 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b07516-3ded-4dd3-8cb4-ac0768b5e405]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2132c96-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:c9:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 265578, 'reachable_time': 35963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211244, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.779 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d256de5e-fa6b-495c-a600-55a1d68d1f08]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9d:c959'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265578, 'tstamp': 265578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211245, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.788 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ac5cf8f1-c13b-4619-b7ff-5abcbc39ced7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2132c96-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:c9:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 265578, 'reachable_time': 35963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211246, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.807 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc8eb2d-02f5-4b84-a110-5d00111dd673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.844 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[6700aa29-b955-462f-a93f-e04e84c1f68c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.845 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2132c96-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.845 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.846 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2132c96-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:29 compute-0 NetworkManager[55211]: <info>  [1768225589.8490] manager: (tapa2132c96-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 12 13:46:29 compute-0 kernel: tapa2132c96-b0: entered promiscuous mode
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.852 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.853 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2132c96-b0, col_values=(('external_ids', {'iface-id': '9c5265ef-1958-4964-ae02-09e78713440d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:29 compute-0 ovn_controller[94974]: 2026-01-12T13:46:29Z|00078|binding|INFO|Releasing lport 9c5265ef-1958-4964-ae02-09e78713440d from this chassis (sb_readonly=0)
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.856 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2132c96-bfe0-4c64-a5b0-a3df61a88e5d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2132c96-bfe0-4c64-a5b0-a3df61a88e5d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.857 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[bb97fa3e-79f6-49e6-b87a-e579b769e18b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.857 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/a2132c96-bfe0-4c64-a5b0-a3df61a88e5d.pid.haproxy
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID a2132c96-bfe0-4c64-a5b0-a3df61a88e5d
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:46:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:29.859 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'env', 'PROCESS_TAG=haproxy-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2132c96-bfe0-4c64-a5b0-a3df61a88e5d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.866 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.963 181991 DEBUG nova.network.neutron [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updated VIF entry in instance network info cache for port e7942da1-f887-4ac9-8a01-72673dab4bd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.964 181991 DEBUG nova.network.neutron [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updating instance_info_cache with network_info: [{"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:29 compute-0 nova_compute[181978]: 2026-01-12 13:46:29.987 181991 DEBUG oslo_concurrency.lockutils [req-5da2b86d-5926-498c-b538-3c596a405648 req-d960fd03-663b-4a8c-a042-f289a8e5be56 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:30 compute-0 podman[211275]: 2026-01-12 13:46:30.145782379 +0000 UTC m=+0.031453002 container create a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 12 13:46:30 compute-0 systemd[1]: Started libpod-conmon-a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a.scope.
Jan 12 13:46:30 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:46:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81441870910472b0a4ec2d2bc5f5e65c6fb651eb92740dc93c68fed778087cd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:46:30 compute-0 podman[211275]: 2026-01-12 13:46:30.2060798 +0000 UTC m=+0.091750423 container init a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:46:30 compute-0 podman[211275]: 2026-01-12 13:46:30.210107548 +0000 UTC m=+0.095778151 container start a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:30 compute-0 podman[211275]: 2026-01-12 13:46:30.131256905 +0000 UTC m=+0.016927538 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:46:30 compute-0 neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d[211287]: [NOTICE]   (211291) : New worker (211293) forked
Jan 12 13:46:30 compute-0 neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d[211287]: [NOTICE]   (211291) : Loading success.
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.468 181991 DEBUG nova.compute.manager [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.468 181991 DEBUG oslo_concurrency.lockutils [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.468 181991 DEBUG oslo_concurrency.lockutils [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.469 181991 DEBUG oslo_concurrency.lockutils [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.469 181991 DEBUG nova.compute.manager [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Processing event network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.469 181991 DEBUG nova.compute.manager [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.469 181991 DEBUG oslo_concurrency.lockutils [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.469 181991 DEBUG oslo_concurrency.lockutils [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.469 181991 DEBUG oslo_concurrency.lockutils [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.469 181991 DEBUG nova.compute.manager [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] No waiting events found dispatching network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.470 181991 WARNING nova.compute.manager [req-975f583c-628b-4ae8-896e-d46f82b5cebe req-6807c986-231e-4dfd-8137-fb9b5a818f58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received unexpected event network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 for instance with vm_state building and task_state spawning.
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.483 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225590.4833412, 38adfe16-dcdc-44a9-8c50-a051037d4bbe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.484 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] VM Started (Lifecycle Event)
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.485 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.488 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.490 181991 INFO nova.virt.libvirt.driver [-] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Instance spawned successfully.
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.491 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.512 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.516 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.518 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.518 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.519 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.519 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.520 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.520 181991 DEBUG nova.virt.libvirt.driver [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.544 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.544 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225590.483559, 38adfe16-dcdc-44a9-8c50-a051037d4bbe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.544 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] VM Paused (Lifecycle Event)
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.566 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.568 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225590.487803, 38adfe16-dcdc-44a9-8c50-a051037d4bbe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.568 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] VM Resumed (Lifecycle Event)
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.574 181991 INFO nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Took 3.45 seconds to spawn the instance on the hypervisor.
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.574 181991 DEBUG nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.581 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.583 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.603 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.622 181991 INFO nova.compute.manager [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Took 3.74 seconds to build instance.
Jan 12 13:46:30 compute-0 nova_compute[181978]: 2026-01-12 13:46:30.638 181991 DEBUG oslo_concurrency.lockutils [None req-a2636109-a869-425c-bab3-4ab94b97bb1c d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:32 compute-0 nova_compute[181978]: 2026-01-12 13:46:32.174 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:32 compute-0 podman[211305]: 2026-01-12 13:46:32.55747486 +0000 UTC m=+0.049279079 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:33 compute-0 nova_compute[181978]: 2026-01-12 13:46:33.949 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:35 compute-0 nova_compute[181978]: 2026-01-12 13:46:35.582 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:35 compute-0 NetworkManager[55211]: <info>  [1768225595.5834] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 12 13:46:35 compute-0 NetworkManager[55211]: <info>  [1768225595.5842] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 12 13:46:35 compute-0 ovn_controller[94974]: 2026-01-12T13:46:35Z|00079|binding|INFO|Releasing lport 9c5265ef-1958-4964-ae02-09e78713440d from this chassis (sb_readonly=0)
Jan 12 13:46:35 compute-0 nova_compute[181978]: 2026-01-12 13:46:35.614 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:35 compute-0 ovn_controller[94974]: 2026-01-12T13:46:35Z|00080|binding|INFO|Releasing lport 9c5265ef-1958-4964-ae02-09e78713440d from this chassis (sb_readonly=0)
Jan 12 13:46:35 compute-0 nova_compute[181978]: 2026-01-12 13:46:35.617 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:36 compute-0 nova_compute[181978]: 2026-01-12 13:46:36.327 181991 DEBUG nova.compute.manager [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-changed-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:36 compute-0 nova_compute[181978]: 2026-01-12 13:46:36.327 181991 DEBUG nova.compute.manager [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Refreshing instance network info cache due to event network-changed-e7942da1-f887-4ac9-8a01-72673dab4bd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:46:36 compute-0 nova_compute[181978]: 2026-01-12 13:46:36.327 181991 DEBUG oslo_concurrency.lockutils [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:36 compute-0 nova_compute[181978]: 2026-01-12 13:46:36.328 181991 DEBUG oslo_concurrency.lockutils [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:36 compute-0 nova_compute[181978]: 2026-01-12 13:46:36.328 181991 DEBUG nova.network.neutron [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Refreshing network info cache for port e7942da1-f887-4ac9-8a01-72673dab4bd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:46:37 compute-0 nova_compute[181978]: 2026-01-12 13:46:37.176 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:37 compute-0 nova_compute[181978]: 2026-01-12 13:46:37.515 181991 DEBUG nova.network.neutron [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updated VIF entry in instance network info cache for port e7942da1-f887-4ac9-8a01-72673dab4bd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:46:37 compute-0 nova_compute[181978]: 2026-01-12 13:46:37.516 181991 DEBUG nova.network.neutron [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updating instance_info_cache with network_info: [{"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:37 compute-0 nova_compute[181978]: 2026-01-12 13:46:37.532 181991 DEBUG oslo_concurrency.lockutils [req-24e2accf-e14c-435b-9faa-91025056bff3 req-4cea9568-a3cb-40e1-86f7-f6213ce87d98 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:38 compute-0 nova_compute[181978]: 2026-01-12 13:46:38.951 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:40.199 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:40.201 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:40.201 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:41 compute-0 ovn_controller[94974]: 2026-01-12T13:46:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:55:fe 10.100.0.5
Jan 12 13:46:41 compute-0 ovn_controller[94974]: 2026-01-12T13:46:41Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:55:fe 10.100.0.5
Jan 12 13:46:42 compute-0 nova_compute[181978]: 2026-01-12 13:46:42.176 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:42 compute-0 podman[211337]: 2026-01-12 13:46:42.557438826 +0000 UTC m=+0.046904149 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:46:42 compute-0 podman[211338]: 2026-01-12 13:46:42.565907742 +0000 UTC m=+0.053196074 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 12 13:46:42 compute-0 podman[211336]: 2026-01-12 13:46:42.599468954 +0000 UTC m=+0.090247315 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:46:43 compute-0 nova_compute[181978]: 2026-01-12 13:46:43.952 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:47 compute-0 nova_compute[181978]: 2026-01-12 13:46:47.179 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:48 compute-0 nova_compute[181978]: 2026-01-12 13:46:48.614 181991 INFO nova.compute.manager [None req-244c4ee8-ae18-46ce-944c-34d5595af211 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Get console output
Jan 12 13:46:48 compute-0 nova_compute[181978]: 2026-01-12 13:46:48.617 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:46:48 compute-0 nova_compute[181978]: 2026-01-12 13:46:48.953 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:49 compute-0 nova_compute[181978]: 2026-01-12 13:46:49.511 181991 DEBUG nova.compute.manager [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-changed-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:49 compute-0 nova_compute[181978]: 2026-01-12 13:46:49.511 181991 DEBUG nova.compute.manager [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Refreshing instance network info cache due to event network-changed-e7942da1-f887-4ac9-8a01-72673dab4bd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:46:49 compute-0 nova_compute[181978]: 2026-01-12 13:46:49.512 181991 DEBUG oslo_concurrency.lockutils [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:49 compute-0 nova_compute[181978]: 2026-01-12 13:46:49.512 181991 DEBUG oslo_concurrency.lockutils [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:49 compute-0 nova_compute[181978]: 2026-01-12 13:46:49.512 181991 DEBUG nova.network.neutron [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Refreshing network info cache for port e7942da1-f887-4ac9-8a01-72673dab4bd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:46:50 compute-0 podman[211398]: 2026-01-12 13:46:50.551679841 +0000 UTC m=+0.040526784 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:46:50 compute-0 nova_compute[181978]: 2026-01-12 13:46:50.803 181991 DEBUG nova.network.neutron [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updated VIF entry in instance network info cache for port e7942da1-f887-4ac9-8a01-72673dab4bd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:46:50 compute-0 nova_compute[181978]: 2026-01-12 13:46:50.803 181991 DEBUG nova.network.neutron [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updating instance_info_cache with network_info: [{"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:50 compute-0 nova_compute[181978]: 2026-01-12 13:46:50.820 181991 DEBUG oslo_concurrency.lockutils [req-287032f7-4910-4c04-bae0-87fe95c37e4e req-0046ee3d-c5ec-466e-b3bd-31d3a7fb12e3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:52 compute-0 nova_compute[181978]: 2026-01-12 13:46:52.181 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:53 compute-0 nova_compute[181978]: 2026-01-12 13:46:53.953 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.278 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.280 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.299 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.350 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.350 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.355 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.355 181991 INFO nova.compute.claims [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.432 181991 DEBUG nova.compute.provider_tree [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.443 181991 DEBUG nova.scheduler.client.report [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.457 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.457 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.497 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.497 181991 DEBUG nova.network.neutron [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.510 181991 INFO nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.520 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.569 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.569 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.570 181991 INFO nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Creating image(s)
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.570 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.570 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.571 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.580 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.625 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.625 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.626 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.635 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.678 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.679 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.699 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.700 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.700 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.744 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.745 181991 DEBUG nova.virt.disk.api [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.745 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.789 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.789 181991 DEBUG nova.virt.disk.api [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.790 181991 DEBUG nova.objects.instance [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 0f7f449a-6fda-4afe-93ed-05763a43015b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.803 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.803 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Ensure instance console log exists: /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.803 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.803 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:54 compute-0 nova_compute[181978]: 2026-01-12 13:46:54.804 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.123 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'name': 'tempest-TestNetworkBasicOps-server-1047315553', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c978298f864c4039b47e09202eaf780c', 'user_id': 'd4158a3958504a578730a6b3561138ce', 'hostId': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.142 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.read.bytes volume: 29514240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.142 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edcf53bf-f793-49c8-b24e-e2a45979d3ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29514240, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.124893', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '283cf9f2-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '20b537352123886b54cf506b264fa43af4ca595ac48fdf2af8e761b8e9383c64'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.124893', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '283d07f8-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': 'dd0e33dc13fff2e3b4f9da679d6cb3552b9711315f1d7658f41356e5f32cb4ab'}]}, 'timestamp': '2026-01-12 13:46:55.143225', '_unique_id': '83628a302db44b89a2e340bee3ed745e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.144 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.152 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.152 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7db3bae9-2065-4c87-b4cf-374196a59cb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.145844', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '283e7f02-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.213032709, 'message_signature': '95577652d1ceaabf6dc7dabec0455efac8a53576754555279fdc8138bcb23a39'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.145844', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '283e886c-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.213032709, 'message_signature': '9f00cd1d7658f4534c348d09b53444cd5c22f28f9df33364b45e939cdfbe3ae3'}]}, 'timestamp': '2026-01-12 13:46:55.153053', '_unique_id': '3929505736e14a148ee1fd383dd759fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.153 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.155 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 38adfe16-dcdc-44a9-8c50-a051037d4bbe / tape7942da1-f8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.155 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2276cfc7-7fbf-446a-99b6-41c0de42ef9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.154221', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '283ef798-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': '0318b9bfa1120d1602c511a021ad8545354ca2bd2fac2be5b0d59ff16d758d46'}]}, 'timestamp': '2026-01-12 13:46:55.155929', '_unique_id': 'e06b6b3f732f4068bc422f4321f83fa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f4a013f-dbb3-4032-a097-480d2829ecc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.156987', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '283f2a9c-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': '475959eb8c1d57a109cec0d1bb0b897d27c1d8f2498ef529796febcb488d01a1'}]}, 'timestamp': '2026-01-12 13:46:55.157213', '_unique_id': '1f14422537a04083929e7872f190b847'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.157 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.158 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.write.bytes volume: 72974336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.158 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98dd7aef-1a9a-4207-9824-441e12d4d963', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72974336, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.158237', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '283f5b2a-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '39f51aa0de24b2fbedba240365e5074e1f167bd23327da5786c94ab575648725'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.158237', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '283f6390-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '56d5e7f8da50599a0421da01e3d0ac7c552e11f486ea83ee67c81ec9a5c4984d'}]}, 'timestamp': '2026-01-12 13:46:55.158658', '_unique_id': 'a82eaf4c510a4e318840fc640b25d4c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.159 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '364fd761-c855-4521-8b71-c6f6783ffef0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.159855', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '283f9b80-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.213032709, 'message_signature': '446350dd23aa89c54b4626cc7b8e0c919bf28ae731b86192d12489e93832a22d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.159855', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '283fa36e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.213032709, 'message_signature': '17ba7b0364b62033dba799ded9a106d7d0fbfed6092d6b8ea4262a3f23aee3ab'}]}, 'timestamp': '2026-01-12 13:46:55.160294', '_unique_id': '1652d0fd2f9844748181b86c32271534'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.160 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.161 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.outgoing.bytes volume: 15952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec42072d-ec2a-4320-847f-ccc15cb5cfd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15952, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.161329', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '283fd406-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': 'bd0c116905d5a69d5f6376b9105d71afda08fa5d85b04e7261726c125665a487'}]}, 'timestamp': '2026-01-12 13:46:55.161561', '_unique_id': 'c3f4613a1a5544678286ffaa84548790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.write.latency volume: 330231798 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.162 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30ddfb14-74ef-48d9-bf9d-77cf7eee8efa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 330231798, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.162575', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2840049e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': 'deb478fd5247fe7d93804059b06336f071f512b1e23e81ff68ee23861ea9a2cc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.162575', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '28400d40-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '54ebcf06a59860fe9ca9b76f60e0a7f69353bc39a32291330cb593dcc07314a4'}]}, 'timestamp': '2026-01-12 13:46:55.163002', '_unique_id': '48feeba377544baaa7047f0541aeda5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.incoming.bytes volume: 19058 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60573e59-4ced-4376-8263-45ecc521b8e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19058, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.164036', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '28403dc4-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': 'd019154ee6eb7045be0ae4b2044dd3ab165324ee19f507382fef851d77e47930'}]}, 'timestamp': '2026-01-12 13:46:55.164256', '_unique_id': '10b8e8168d4f465da01d2fb69acacf05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.164 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.165 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.read.requests volume: 1058 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.165 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee06cbb2-5fb2-4098-a682-f9a09b1510ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1058, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.165264', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '28406d94-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '9ebe05bda8dc46acde5886b1e89dd2016fcee77b449a1f17adc14764194ef2a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.165264', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '284075c8-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': 'e91c72d79f07e6ce0e447a911ed412aca748c8dce9d7a16277263e64c2331007'}]}, 'timestamp': '2026-01-12 13:46:55.165678', '_unique_id': '165c980f58ee483cbec5ae913fbd0daa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.166 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52bf4483-ffae-404b-9ec6-55879da1b37a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.166702', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2840a61a-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.213032709, 'message_signature': 'a88420cade0da046689a5af2e115a80f0c50b3354aeb79ad05a67b8e67506325'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.166702', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2840aeda-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.213032709, 'message_signature': 'da9af17a0e306a9f1334b881d5cbbbbb6aa18b9c7769310f4d0f8b17042a20b1'}]}, 'timestamp': '2026-01-12 13:46:55.167140', '_unique_id': '820664b7fc4e44d98c8751f29d3133c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.167 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.168 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.178 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/memory.usage volume: 42.6640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3bea2cd-82f4-44d8-a843-ed6682490585', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6640625, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'timestamp': '2026-01-12T13:46:55.168176', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '284273be-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.24561941, 'message_signature': '4e9b4439108d489c3eeb37c81c62ed2e43cd686d1a5f7c54d960a42defa68ef7'}]}, 'timestamp': '2026-01-12 13:46:55.178743', '_unique_id': '694f6c283ca141488f52cfe407716200'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.179 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35c5408e-ef45-4192-b774-53146bbb64fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.179778', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2842a546-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '9c773b9c8d2223ecad91a21c7293f88657cc18c101c3bed62ce9b9c335357daf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.179778', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2842ad66-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '111aba6b64c63c4e74ff593cc3b51be6b12c5b0cd4a984dec48e5477fb2d6896'}]}, 'timestamp': '2026-01-12 13:46:55.180209', '_unique_id': '632d9eda422d40b4816a4aa3a9842ef3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.180 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.incoming.packets volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5714f40e-7c59-4b78-9779-3eb6bcba9caa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 103, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.181240', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '2842ddc2-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': 'b252463c9ef792b52bfed9cbd29bf94668d3885ccbfc27985100536d6119716e'}]}, 'timestamp': '2026-01-12 13:46:55.181460', '_unique_id': 'da79425275e247da91b84d82fdaeba5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.181 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.182 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.182 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>]
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.182 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.182 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/cpu volume: 9620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44cf1ce-ebd1-4439-94c7-497946712bc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9620000000, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'timestamp': '2026-01-12T13:46:55.182786', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '28431ae4-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.24561941, 'message_signature': 'e800fc042de42b8e3238a94e75b89d82472f1051d0cf2c8a6e719be2f38ad13f'}]}, 'timestamp': '2026-01-12 13:46:55.183020', '_unique_id': 'c997efd7827643ad8c3d0ea4d95095f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.184 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.184 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>]
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.184 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.184 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>]
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.184 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea2521ee-657b-436a-ac95-f8e37549c2d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.184597', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '28436102-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': '09a70784b8e4e5e6de25d3d135c17155c400bae5af6a1fa4c8401152edafa584'}]}, 'timestamp': '2026-01-12 13:46:55.184821', '_unique_id': '5de10a92b2214166b08b792534e89055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.185 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1047315553>]
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.outgoing.packets volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98ed294b-3576-49af-b555-151020848e8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 108, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.186127', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '28439cc6-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': 'efeeb4643d8e2a9e9358709d423dcd6be56daaf830adf619576736df1cfb51a3'}]}, 'timestamp': '2026-01-12 13:46:55.186351', '_unique_id': 'c2ec072530b34ad49bbfde272f685a21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.186 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.187 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.read.latency volume: 171671668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.187 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk.device.read.latency volume: 17848587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ed0bb3c-1370-4ef5-b94b-836f4df091e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 171671668, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-vda', 'timestamp': '2026-01-12T13:46:55.187358', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2843cc96-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': 'ac1a8e6eb75da76862dbc0c8a57a568ee533ea6adad8e7ffbcfe0f47ec7d4c20'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17848587, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe-sda', 'timestamp': '2026-01-12T13:46:55.187358', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'instance-00000004', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2843d4e8-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.192143901, 'message_signature': '7e317882b805104aad6bd12c19f4285fbf92b39d1535211013b2d51263d0b68a'}]}, 'timestamp': '2026-01-12 13:46:55.187775', '_unique_id': 'a3aca1498ae042b9a5d8e2574e07490e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.188 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17124554-a44f-4d5a-b39f-9492994a1ef9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.188799', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '284405c6-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': '81ce555f19337076a781f39d466c3dbe93eb9e679f6c8e67a82f03ef2207388f'}]}, 'timestamp': '2026-01-12 13:46:55.189037', '_unique_id': '5f5906d0573649dab597985d447d7668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fee5c7fe-841f-40ef-8b7e-5787ed64b71e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.190034', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '28443550-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': '8399f31d4ae882578a3538ff14861038a5aa7f8bb29430dca9e6916c7598eff8'}]}, 'timestamp': '2026-01-12 13:46:55.190254', '_unique_id': '3ead995f7dcd4114a07e66ec910e34cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.190 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.191 12 DEBUG ceilometer.compute.pollsters [-] 38adfe16-dcdc-44a9-8c50-a051037d4bbe/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be2148b-bea3-4350-ab79-651a82f4c70c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-00000004-38adfe16-dcdc-44a9-8c50-a051037d4bbe-tape7942da1-f8', 'timestamp': '2026-01-12T13:46:55.191296', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1047315553', 'name': 'tape7942da1-f8', 'instance_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:55:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7942da1-f8'}, 'message_id': '28446692-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2681.221391047, 'message_signature': '29ce4b7dd26343846a33c4d6e8a70787d74a989b09ac744a47368a0f0cc8dced'}]}, 'timestamp': '2026-01-12 13:46:55.191525', '_unique_id': 'c29568139ef5453ea7b8bedad47dbfc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:46:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:46:55.192 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:46:55 compute-0 nova_compute[181978]: 2026-01-12 13:46:55.217 181991 DEBUG nova.policy [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:46:56 compute-0 nova_compute[181978]: 2026-01-12 13:46:56.322 181991 DEBUG nova.network.neutron [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Successfully created port: 4291c6e9-3942-4177-a9d7-9de0f6c79518 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.184 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.399 181991 DEBUG nova.network.neutron [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Successfully updated port: 4291c6e9-3942-4177-a9d7-9de0f6c79518 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.410 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.410 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.410 181991 DEBUG nova.network.neutron [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.464 181991 DEBUG nova.compute.manager [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-changed-4291c6e9-3942-4177-a9d7-9de0f6c79518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.465 181991 DEBUG nova.compute.manager [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Refreshing instance network info cache due to event network-changed-4291c6e9-3942-4177-a9d7-9de0f6c79518. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.465 181991 DEBUG oslo_concurrency.lockutils [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:46:57 compute-0 nova_compute[181978]: 2026-01-12 13:46:57.504 181991 DEBUG nova.network.neutron [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.555 181991 DEBUG nova.network.neutron [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Updating instance_info_cache with network_info: [{"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.576 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.576 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Instance network_info: |[{"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.576 181991 DEBUG oslo_concurrency.lockutils [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.577 181991 DEBUG nova.network.neutron [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Refreshing network info cache for port 4291c6e9-3942-4177-a9d7-9de0f6c79518 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.578 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Start _get_guest_xml network_info=[{"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.583 181991 WARNING nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.591 181991 DEBUG nova.virt.libvirt.host [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.591 181991 DEBUG nova.virt.libvirt.host [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.593 181991 DEBUG nova.virt.libvirt.host [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.593 181991 DEBUG nova.virt.libvirt.host [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.594 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.594 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.594 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.595 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.595 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.595 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.595 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.595 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.595 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.596 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.596 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.596 181991 DEBUG nova.virt.hardware [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.599 181991 DEBUG nova.virt.libvirt.vif [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:46:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-800081421',display_name='tempest-TestNetworkBasicOps-server-800081421',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-800081421',id=5,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIwjgUkzRWqYp8kk/KWyGtNFpJx7oY5uCKWdb7DBCPlGn05COvvPPwzIUA2o7WoO0vRwWCsSXSn+C0xoFKzoVekooyh9mCtLOB7x2srQsYV2X+b4fQ8XBjk9cCoTl7p20w==',key_name='tempest-TestNetworkBasicOps-802423183',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-qltmjv9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:46:54Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=0f7f449a-6fda-4afe-93ed-05763a43015b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.599 181991 DEBUG nova.network.os_vif_util [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.599 181991 DEBUG nova.network.os_vif_util [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:42:5a,bridge_name='br-int',has_traffic_filtering=True,id=4291c6e9-3942-4177-a9d7-9de0f6c79518,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4291c6e9-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.600 181991 DEBUG nova.objects.instance [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 0f7f449a-6fda-4afe-93ed-05763a43015b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.609 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <uuid>0f7f449a-6fda-4afe-93ed-05763a43015b</uuid>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <name>instance-00000005</name>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-800081421</nova:name>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:46:58</nova:creationTime>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         <nova:port uuid="4291c6e9-3942-4177-a9d7-9de0f6c79518">
Jan 12 13:46:58 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <system>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <entry name="serial">0f7f449a-6fda-4afe-93ed-05763a43015b</entry>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <entry name="uuid">0f7f449a-6fda-4afe-93ed-05763a43015b</entry>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </system>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <os>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   </os>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <features>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   </features>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk.config"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:06:42:5a"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <target dev="tap4291c6e9-39"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/console.log" append="off"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <video>
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </video>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:46:58 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:46:58 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:46:58 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:46:58 compute-0 nova_compute[181978]: </domain>
Jan 12 13:46:58 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.610 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Preparing to wait for external event network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.610 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.610 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.611 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.611 181991 DEBUG nova.virt.libvirt.vif [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:46:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-800081421',display_name='tempest-TestNetworkBasicOps-server-800081421',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-800081421',id=5,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIwjgUkzRWqYp8kk/KWyGtNFpJx7oY5uCKWdb7DBCPlGn05COvvPPwzIUA2o7WoO0vRwWCsSXSn+C0xoFKzoVekooyh9mCtLOB7x2srQsYV2X+b4fQ8XBjk9cCoTl7p20w==',key_name='tempest-TestNetworkBasicOps-802423183',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-qltmjv9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:46:54Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=0f7f449a-6fda-4afe-93ed-05763a43015b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.611 181991 DEBUG nova.network.os_vif_util [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.612 181991 DEBUG nova.network.os_vif_util [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:42:5a,bridge_name='br-int',has_traffic_filtering=True,id=4291c6e9-3942-4177-a9d7-9de0f6c79518,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4291c6e9-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.612 181991 DEBUG os_vif [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:42:5a,bridge_name='br-int',has_traffic_filtering=True,id=4291c6e9-3942-4177-a9d7-9de0f6c79518,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4291c6e9-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.612 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.613 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.613 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.615 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.615 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4291c6e9-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.615 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4291c6e9-39, col_values=(('external_ids', {'iface-id': '4291c6e9-3942-4177-a9d7-9de0f6c79518', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:42:5a', 'vm-uuid': '0f7f449a-6fda-4afe-93ed-05763a43015b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.616 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:58 compute-0 NetworkManager[55211]: <info>  [1768225618.6171] manager: (tap4291c6e9-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.619 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.621 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.622 181991 INFO os_vif [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:42:5a,bridge_name='br-int',has_traffic_filtering=True,id=4291c6e9-3942-4177-a9d7-9de0f6c79518,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4291c6e9-39')
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.651 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.651 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.651 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:06:42:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:46:58 compute-0 nova_compute[181978]: 2026-01-12 13:46:58.652 181991 INFO nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Using config drive
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.309 181991 INFO nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Creating config drive at /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk.config
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.314 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3bklz0z6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.431 181991 DEBUG oslo_concurrency.processutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3bklz0z6" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:46:59 compute-0 kernel: tap4291c6e9-39: entered promiscuous mode
Jan 12 13:46:59 compute-0 NetworkManager[55211]: <info>  [1768225619.4644] manager: (tap4291c6e9-39): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Jan 12 13:46:59 compute-0 ovn_controller[94974]: 2026-01-12T13:46:59Z|00081|binding|INFO|Claiming lport 4291c6e9-3942-4177-a9d7-9de0f6c79518 for this chassis.
Jan 12 13:46:59 compute-0 ovn_controller[94974]: 2026-01-12T13:46:59Z|00082|binding|INFO|4291c6e9-3942-4177-a9d7-9de0f6c79518: Claiming fa:16:3e:06:42:5a 10.100.0.10
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.467 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.470 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:42:5a 10.100.0.10'], port_security=['fa:16:3e:06:42:5a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0f7f449a-6fda-4afe-93ed-05763a43015b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '33762d47-b42a-4231-ac6f-7bb9f318bc57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd27fee5-c567-45b5-8a69-ab9b5802587e, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=4291c6e9-3942-4177-a9d7-9de0f6c79518) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.471 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 4291c6e9-3942-4177-a9d7-9de0f6c79518 in datapath a2132c96-bfe0-4c64-a5b0-a3df61a88e5d bound to our chassis
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.472 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2132c96-bfe0-4c64-a5b0-a3df61a88e5d
Jan 12 13:46:59 compute-0 ovn_controller[94974]: 2026-01-12T13:46:59Z|00083|binding|INFO|Setting lport 4291c6e9-3942-4177-a9d7-9de0f6c79518 ovn-installed in OVS
Jan 12 13:46:59 compute-0 ovn_controller[94974]: 2026-01-12T13:46:59Z|00084|binding|INFO|Setting lport 4291c6e9-3942-4177-a9d7-9de0f6c79518 up in Southbound
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.479 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.481 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.483 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b66276f2-d3ae-4f3d-8c90-25d1d27817dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:59 compute-0 systemd-machined[153581]: New machine qemu-5-instance-00000005.
Jan 12 13:46:59 compute-0 systemd-udevd[211452]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:46:59 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.502 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0b92c4-ab54-4d60-a267-4fdc4ad63bdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:59 compute-0 NetworkManager[55211]: <info>  [1768225619.5065] device (tap4291c6e9-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:46:59 compute-0 NetworkManager[55211]: <info>  [1768225619.5072] device (tap4291c6e9-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.505 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[33a49e5b-3d75-48dd-9936-d2ec306bba17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.523 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[9059cc97-d8fe-4c44-8861-06d029874411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.535 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[9bec64f6-fd4f-4bdc-a107-22bfba6f8acb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2132c96-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:c9:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 265578, 'reachable_time': 35963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211459, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.545 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0d7492-431f-41c4-a93b-c6f9ef90c67e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265585, 'tstamp': 265585}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211463, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265586, 'tstamp': 265586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211463, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.546 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2132c96-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.547 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.548 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.549 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2132c96-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.549 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.549 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2132c96-b0, col_values=(('external_ids', {'iface-id': '9c5265ef-1958-4964-ae02-09e78713440d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:46:59 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:46:59.550 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.681 181991 DEBUG nova.compute.manager [req-9eb29916-df1f-4e86-9613-74c778b91ec9 req-ae0994e2-4771-4737-8cd4-b4568d21d05b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.682 181991 DEBUG oslo_concurrency.lockutils [req-9eb29916-df1f-4e86-9613-74c778b91ec9 req-ae0994e2-4771-4737-8cd4-b4568d21d05b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.682 181991 DEBUG oslo_concurrency.lockutils [req-9eb29916-df1f-4e86-9613-74c778b91ec9 req-ae0994e2-4771-4737-8cd4-b4568d21d05b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.682 181991 DEBUG oslo_concurrency.lockutils [req-9eb29916-df1f-4e86-9613-74c778b91ec9 req-ae0994e2-4771-4737-8cd4-b4568d21d05b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.682 181991 DEBUG nova.compute.manager [req-9eb29916-df1f-4e86-9613-74c778b91ec9 req-ae0994e2-4771-4737-8cd4-b4568d21d05b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Processing event network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.694 181991 DEBUG nova.network.neutron [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Updated VIF entry in instance network info cache for port 4291c6e9-3942-4177-a9d7-9de0f6c79518. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.695 181991 DEBUG nova.network.neutron [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Updating instance_info_cache with network_info: [{"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:46:59 compute-0 nova_compute[181978]: 2026-01-12 13:46:59.709 181991 DEBUG oslo_concurrency.lockutils [req-2eaef461-ddcc-41a7-8d71-1bdeece82e6b req-b5f4f5e4-e520-41de-ad66-c0644ab38b70 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.289 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225620.289285, 0f7f449a-6fda-4afe-93ed-05763a43015b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.290 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] VM Started (Lifecycle Event)
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.291 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.295 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.297 181991 INFO nova.virt.libvirt.driver [-] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Instance spawned successfully.
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.297 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.308 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.312 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.314 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.314 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.315 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.315 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.315 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.316 181991 DEBUG nova.virt.libvirt.driver [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.339 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.339 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225620.2898118, 0f7f449a-6fda-4afe-93ed-05763a43015b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.340 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] VM Paused (Lifecycle Event)
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.362 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.364 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225620.293174, 0f7f449a-6fda-4afe-93ed-05763a43015b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.364 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] VM Resumed (Lifecycle Event)
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.370 181991 INFO nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Took 5.80 seconds to spawn the instance on the hypervisor.
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.370 181991 DEBUG nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.375 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.377 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.391 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.412 181991 INFO nova.compute.manager [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Took 6.08 seconds to build instance.
Jan 12 13:47:00 compute-0 nova_compute[181978]: 2026-01-12 13:47:00.422 181991 DEBUG oslo_concurrency.lockutils [None req-52f031a4-f5e1-4177-b904-d54e737a3687 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:00 compute-0 podman[211472]: 2026-01-12 13:47:00.555135516 +0000 UTC m=+0.041782394 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:47:01 compute-0 nova_compute[181978]: 2026-01-12 13:47:01.739 181991 DEBUG nova.compute.manager [req-1863c5a0-e679-4e6a-b515-6729053dab9c req-d0b46889-4644-4514-b7dc-534908f1cbef 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:01 compute-0 nova_compute[181978]: 2026-01-12 13:47:01.739 181991 DEBUG oslo_concurrency.lockutils [req-1863c5a0-e679-4e6a-b515-6729053dab9c req-d0b46889-4644-4514-b7dc-534908f1cbef 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:01 compute-0 nova_compute[181978]: 2026-01-12 13:47:01.739 181991 DEBUG oslo_concurrency.lockutils [req-1863c5a0-e679-4e6a-b515-6729053dab9c req-d0b46889-4644-4514-b7dc-534908f1cbef 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:01 compute-0 nova_compute[181978]: 2026-01-12 13:47:01.739 181991 DEBUG oslo_concurrency.lockutils [req-1863c5a0-e679-4e6a-b515-6729053dab9c req-d0b46889-4644-4514-b7dc-534908f1cbef 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:01 compute-0 nova_compute[181978]: 2026-01-12 13:47:01.740 181991 DEBUG nova.compute.manager [req-1863c5a0-e679-4e6a-b515-6729053dab9c req-d0b46889-4644-4514-b7dc-534908f1cbef 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] No waiting events found dispatching network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:47:01 compute-0 nova_compute[181978]: 2026-01-12 13:47:01.740 181991 WARNING nova.compute.manager [req-1863c5a0-e679-4e6a-b515-6729053dab9c req-d0b46889-4644-4514-b7dc-534908f1cbef 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received unexpected event network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 for instance with vm_state active and task_state None.
Jan 12 13:47:02 compute-0 nova_compute[181978]: 2026-01-12 13:47:02.186 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:02.446 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:47:02 compute-0 nova_compute[181978]: 2026-01-12 13:47:02.447 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:02.447 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:47:03 compute-0 podman[211493]: 2026-01-12 13:47:03.553339101 +0000 UTC m=+0.047986012 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 12 13:47:03 compute-0 nova_compute[181978]: 2026-01-12 13:47:03.616 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:03 compute-0 nova_compute[181978]: 2026-01-12 13:47:03.821 181991 DEBUG nova.compute.manager [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-changed-4291c6e9-3942-4177-a9d7-9de0f6c79518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:03 compute-0 nova_compute[181978]: 2026-01-12 13:47:03.822 181991 DEBUG nova.compute.manager [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Refreshing instance network info cache due to event network-changed-4291c6e9-3942-4177-a9d7-9de0f6c79518. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:47:03 compute-0 nova_compute[181978]: 2026-01-12 13:47:03.822 181991 DEBUG oslo_concurrency.lockutils [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:47:03 compute-0 nova_compute[181978]: 2026-01-12 13:47:03.822 181991 DEBUG oslo_concurrency.lockutils [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:47:03 compute-0 nova_compute[181978]: 2026-01-12 13:47:03.823 181991 DEBUG nova.network.neutron [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Refreshing network info cache for port 4291c6e9-3942-4177-a9d7-9de0f6c79518 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:47:04 compute-0 nova_compute[181978]: 2026-01-12 13:47:04.817 181991 DEBUG nova.network.neutron [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Updated VIF entry in instance network info cache for port 4291c6e9-3942-4177-a9d7-9de0f6c79518. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:47:04 compute-0 nova_compute[181978]: 2026-01-12 13:47:04.818 181991 DEBUG nova.network.neutron [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Updating instance_info_cache with network_info: [{"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:47:04 compute-0 nova_compute[181978]: 2026-01-12 13:47:04.833 181991 DEBUG oslo_concurrency.lockutils [req-41b3a715-5c21-4505-a598-47a278f05b04 req-af198e31-a8eb-4071-9ee7-ddc78c1aeb5e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-0f7f449a-6fda-4afe-93ed-05763a43015b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:47:07 compute-0 nova_compute[181978]: 2026-01-12 13:47:07.187 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:08 compute-0 nova_compute[181978]: 2026-01-12 13:47:08.617 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:10 compute-0 ovn_controller[94974]: 2026-01-12T13:47:10Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:06:42:5a 10.100.0.10
Jan 12 13:47:10 compute-0 ovn_controller[94974]: 2026-01-12T13:47:10Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:06:42:5a 10.100.0.10
Jan 12 13:47:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:11.448 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:12 compute-0 nova_compute[181978]: 2026-01-12 13:47:12.189 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:13 compute-0 podman[211523]: 2026-01-12 13:47:13.569477454 +0000 UTC m=+0.054894897 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 12 13:47:13 compute-0 podman[211521]: 2026-01-12 13:47:13.578199896 +0000 UTC m=+0.069098331 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 12 13:47:13 compute-0 podman[211522]: 2026-01-12 13:47:13.582443031 +0000 UTC m=+0.070626682 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 12 13:47:13 compute-0 nova_compute[181978]: 2026-01-12 13:47:13.618 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:16 compute-0 nova_compute[181978]: 2026-01-12 13:47:16.838 181991 INFO nova.compute.manager [None req-358e1640-c122-4903-a456-4da68240ebe6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Get console output
Jan 12 13:47:16 compute-0 nova_compute[181978]: 2026-01-12 13:47:16.841 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.055 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.055 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.055 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.056 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.056 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.057 181991 INFO nova.compute.manager [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Terminating instance
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.057 181991 DEBUG nova.compute.manager [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:47:17 compute-0 kernel: tap4291c6e9-39 (unregistering): left promiscuous mode
Jan 12 13:47:17 compute-0 NetworkManager[55211]: <info>  [1768225637.0801] device (tap4291c6e9-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:47:17 compute-0 ovn_controller[94974]: 2026-01-12T13:47:17Z|00085|binding|INFO|Releasing lport 4291c6e9-3942-4177-a9d7-9de0f6c79518 from this chassis (sb_readonly=0)
Jan 12 13:47:17 compute-0 ovn_controller[94974]: 2026-01-12T13:47:17Z|00086|binding|INFO|Setting lport 4291c6e9-3942-4177-a9d7-9de0f6c79518 down in Southbound
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.085 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_controller[94974]: 2026-01-12T13:47:17Z|00087|binding|INFO|Removing iface tap4291c6e9-39 ovn-installed in OVS
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.087 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.091 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:42:5a 10.100.0.10'], port_security=['fa:16:3e:06:42:5a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0f7f449a-6fda-4afe-93ed-05763a43015b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33762d47-b42a-4231-ac6f-7bb9f318bc57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd27fee5-c567-45b5-8a69-ab9b5802587e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=4291c6e9-3942-4177-a9d7-9de0f6c79518) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.092 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 4291c6e9-3942-4177-a9d7-9de0f6c79518 in datapath a2132c96-bfe0-4c64-a5b0-a3df61a88e5d unbound from our chassis
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.093 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2132c96-bfe0-4c64-a5b0-a3df61a88e5d
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.099 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.106 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[077bf4d5-1dfa-4b82-8665-e81918100abb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 12 13:47:17 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 11.148s CPU time.
Jan 12 13:47:17 compute-0 systemd-machined[153581]: Machine qemu-5-instance-00000005 terminated.
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.126 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[964409db-def9-422d-b5a2-4ab92025eb7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.128 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[c1934e4b-9d20-4e94-808f-737cfec60144]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.144 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[51cac90f-be81-44fd-a8b5-c3cc7f4dcf9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.158 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d88fd361-ded8-4031-97e5-72c20ab9f54b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2132c96-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:c9:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 265578, 'reachable_time': 22598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211595, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.168 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7982d3d0-4d59-4593-aed5-86cc668d6b42]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265585, 'tstamp': 265585}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211596, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265586, 'tstamp': 265586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211596, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.169 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2132c96-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.170 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.172 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.173 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2132c96-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.173 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.174 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2132c96-b0, col_values=(('external_ids', {'iface-id': '9c5265ef-1958-4964-ae02-09e78713440d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.174 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.190 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 kernel: tap4291c6e9-39: entered promiscuous mode
Jan 12 13:47:17 compute-0 systemd-udevd[211587]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:47:17 compute-0 NetworkManager[55211]: <info>  [1768225637.2711] manager: (tap4291c6e9-39): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 12 13:47:17 compute-0 ovn_controller[94974]: 2026-01-12T13:47:17Z|00088|binding|INFO|Claiming lport 4291c6e9-3942-4177-a9d7-9de0f6c79518 for this chassis.
Jan 12 13:47:17 compute-0 kernel: tap4291c6e9-39 (unregistering): left promiscuous mode
Jan 12 13:47:17 compute-0 ovn_controller[94974]: 2026-01-12T13:47:17Z|00089|binding|INFO|4291c6e9-3942-4177-a9d7-9de0f6c79518: Claiming fa:16:3e:06:42:5a 10.100.0.10
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.272 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.278 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:42:5a 10.100.0.10'], port_security=['fa:16:3e:06:42:5a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0f7f449a-6fda-4afe-93ed-05763a43015b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33762d47-b42a-4231-ac6f-7bb9f318bc57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd27fee5-c567-45b5-8a69-ab9b5802587e, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=4291c6e9-3942-4177-a9d7-9de0f6c79518) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.279 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 4291c6e9-3942-4177-a9d7-9de0f6c79518 in datapath a2132c96-bfe0-4c64-a5b0-a3df61a88e5d bound to our chassis
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.280 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2132c96-bfe0-4c64-a5b0-a3df61a88e5d
Jan 12 13:47:17 compute-0 ovn_controller[94974]: 2026-01-12T13:47:17Z|00090|binding|INFO|Releasing lport 4291c6e9-3942-4177-a9d7-9de0f6c79518 from this chassis (sb_readonly=0)
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.288 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.294 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:42:5a 10.100.0.10'], port_security=['fa:16:3e:06:42:5a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '0f7f449a-6fda-4afe-93ed-05763a43015b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33762d47-b42a-4231-ac6f-7bb9f318bc57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd27fee5-c567-45b5-8a69-ab9b5802587e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=4291c6e9-3942-4177-a9d7-9de0f6c79518) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.294 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[20f1bdd1-a23d-40c6-beb0-8356f052a1ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.305 181991 INFO nova.virt.libvirt.driver [-] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Instance destroyed successfully.
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.305 181991 DEBUG nova.objects.instance [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 0f7f449a-6fda-4afe-93ed-05763a43015b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.316 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[54b02907-e9cc-416d-9552-4adcabcacc24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.317 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[76d7d6b7-a897-4694-bb0f-0ae7089accfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.328 181991 DEBUG nova.virt.libvirt.vif [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:46:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-800081421',display_name='tempest-TestNetworkBasicOps-server-800081421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-800081421',id=5,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIwjgUkzRWqYp8kk/KWyGtNFpJx7oY5uCKWdb7DBCPlGn05COvvPPwzIUA2o7WoO0vRwWCsSXSn+C0xoFKzoVekooyh9mCtLOB7x2srQsYV2X+b4fQ8XBjk9cCoTl7p20w==',key_name='tempest-TestNetworkBasicOps-802423183',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-qltmjv9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:00Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=0f7f449a-6fda-4afe-93ed-05763a43015b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.328 181991 DEBUG nova.network.os_vif_util [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "address": "fa:16:3e:06:42:5a", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4291c6e9-39", "ovs_interfaceid": "4291c6e9-3942-4177-a9d7-9de0f6c79518", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.329 181991 DEBUG nova.network.os_vif_util [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:42:5a,bridge_name='br-int',has_traffic_filtering=True,id=4291c6e9-3942-4177-a9d7-9de0f6c79518,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4291c6e9-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.329 181991 DEBUG os_vif [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:42:5a,bridge_name='br-int',has_traffic_filtering=True,id=4291c6e9-3942-4177-a9d7-9de0f6c79518,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4291c6e9-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.330 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.331 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4291c6e9-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.332 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.334 181991 DEBUG nova.compute.manager [req-ecc5083c-5bec-43e6-ad7d-cd7f7855551d req-85b36840-c7a0-4bd8-a2e7-0532c37e34fa 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-vif-unplugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.334 181991 DEBUG oslo_concurrency.lockutils [req-ecc5083c-5bec-43e6-ad7d-cd7f7855551d req-85b36840-c7a0-4bd8-a2e7-0532c37e34fa 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.334 181991 DEBUG oslo_concurrency.lockutils [req-ecc5083c-5bec-43e6-ad7d-cd7f7855551d req-85b36840-c7a0-4bd8-a2e7-0532c37e34fa 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.334 181991 DEBUG oslo_concurrency.lockutils [req-ecc5083c-5bec-43e6-ad7d-cd7f7855551d req-85b36840-c7a0-4bd8-a2e7-0532c37e34fa 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.334 181991 DEBUG nova.compute.manager [req-ecc5083c-5bec-43e6-ad7d-cd7f7855551d req-85b36840-c7a0-4bd8-a2e7-0532c37e34fa 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] No waiting events found dispatching network-vif-unplugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.335 181991 DEBUG nova.compute.manager [req-ecc5083c-5bec-43e6-ad7d-cd7f7855551d req-85b36840-c7a0-4bd8-a2e7-0532c37e34fa 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-vif-unplugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.335 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.337 181991 INFO os_vif [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:42:5a,bridge_name='br-int',has_traffic_filtering=True,id=4291c6e9-3942-4177-a9d7-9de0f6c79518,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4291c6e9-39')
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.336 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[39c956c4-3a60-4c4c-acd6-179e6544f858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.337 181991 INFO nova.virt.libvirt.driver [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Deleting instance files /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b_del
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.337 181991 INFO nova.virt.libvirt.driver [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Deletion of /var/lib/nova/instances/0f7f449a-6fda-4afe-93ed-05763a43015b_del complete
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.349 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[42aa43c7-db67-4006-a5fa-77bf9fd5e67d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2132c96-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:c9:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 265578, 'reachable_time': 22598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211614, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.360 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[a058f4a4-bf1b-413b-b583-5d62a6f30371]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265585, 'tstamp': 265585}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211615, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265586, 'tstamp': 265586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211615, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.361 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2132c96-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.361 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.364 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2132c96-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.364 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.364 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2132c96-b0, col_values=(('external_ids', {'iface-id': '9c5265ef-1958-4964-ae02-09e78713440d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.365 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.366 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 4291c6e9-3942-4177-a9d7-9de0f6c79518 in datapath a2132c96-bfe0-4c64-a5b0-a3df61a88e5d unbound from our chassis
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.367 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2132c96-bfe0-4c64-a5b0-a3df61a88e5d
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.377 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[cedee0fe-e3d3-4e98-a703-685d5793da4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.397 181991 INFO nova.compute.manager [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.397 181991 DEBUG oslo.service.loopingcall [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.397 181991 DEBUG nova.compute.manager [-] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.397 181991 DEBUG nova.network.neutron [-] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.398 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[e892d38f-8799-4027-92d5-381d81748840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.400 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[e1363417-bd8d-4723-b844-4ef1486739c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.420 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[c5dcef53-cf99-4eaa-8797-89e981919f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.432 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d9730ac0-c110-47af-964a-912ba943d5f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2132c96-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:c9:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 11, 'rx_bytes': 658, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 11, 'rx_bytes': 658, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 265578, 'reachable_time': 22598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211621, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.444 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8ced2b8b-7324-46e0-bb72-8fe646072b5c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265585, 'tstamp': 265585}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211622, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa2132c96-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 265586, 'tstamp': 265586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211622, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.445 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2132c96-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.446 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.448 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2132c96-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.449 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.449 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2132c96-b0, col_values=(('external_ids', {'iface-id': '9c5265ef-1958-4964-ae02-09e78713440d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:17.450 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.503 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.620 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.620 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquired lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.620 181991 DEBUG nova.network.neutron [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 12 13:47:17 compute-0 nova_compute[181978]: 2026-01-12 13:47:17.620 181991 DEBUG nova.objects.instance [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 38adfe16-dcdc-44a9-8c50-a051037d4bbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.385 181991 DEBUG nova.network.neutron [-] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.405 181991 INFO nova.compute.manager [-] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Took 1.01 seconds to deallocate network for instance.
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.444 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.444 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.486 181991 DEBUG nova.compute.manager [req-d07e64db-e50e-4ce3-ac23-c6a897322dc7 req-3ec74fb1-1ea0-4e1e-989b-f019b8e448c9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-vif-deleted-4291c6e9-3942-4177-a9d7-9de0f6c79518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.505 181991 DEBUG nova.compute.provider_tree [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.518 181991 DEBUG nova.scheduler.client.report [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.536 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.558 181991 INFO nova.scheduler.client.report [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 0f7f449a-6fda-4afe-93ed-05763a43015b
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.611 181991 DEBUG oslo_concurrency.lockutils [None req-de85f9eb-ac6d-4a71-b9e9-b056ec2d7c09 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.856 181991 DEBUG nova.network.neutron [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updating instance_info_cache with network_info: [{"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.869 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Releasing lock "refresh_cache-38adfe16-dcdc-44a9-8c50-a051037d4bbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.870 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.870 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:18 compute-0 nova_compute[181978]: 2026-01-12 13:47:18.870 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.415 181991 DEBUG nova.compute.manager [req-8f47a247-19b7-4fe0-a175-cabc4b9eec4d req-fdf21e66-8d98-412c-a7ac-b4500c43bc92 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received event network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.416 181991 DEBUG oslo_concurrency.lockutils [req-8f47a247-19b7-4fe0-a175-cabc4b9eec4d req-fdf21e66-8d98-412c-a7ac-b4500c43bc92 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.416 181991 DEBUG oslo_concurrency.lockutils [req-8f47a247-19b7-4fe0-a175-cabc4b9eec4d req-fdf21e66-8d98-412c-a7ac-b4500c43bc92 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.417 181991 DEBUG oslo_concurrency.lockutils [req-8f47a247-19b7-4fe0-a175-cabc4b9eec4d req-fdf21e66-8d98-412c-a7ac-b4500c43bc92 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "0f7f449a-6fda-4afe-93ed-05763a43015b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.417 181991 DEBUG nova.compute.manager [req-8f47a247-19b7-4fe0-a175-cabc4b9eec4d req-fdf21e66-8d98-412c-a7ac-b4500c43bc92 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] No waiting events found dispatching network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.417 181991 WARNING nova.compute.manager [req-8f47a247-19b7-4fe0-a175-cabc4b9eec4d req-fdf21e66-8d98-412c-a7ac-b4500c43bc92 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Received unexpected event network-vif-plugged-4291c6e9-3942-4177-a9d7-9de0f6c79518 for instance with vm_state deleted and task_state None.
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.500 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.500 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.501 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.540 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.586 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.587 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.632 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.840 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.841 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5581MB free_disk=73.35108947753906GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.841 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.841 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.886 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Instance 38adfe16-dcdc-44a9-8c50-a051037d4bbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.887 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.887 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.920 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.929 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.940 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:47:19 compute-0 nova_compute[181978]: 2026-01-12 13:47:19.941 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.722 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.722 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.723 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.723 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.723 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.724 181991 INFO nova.compute.manager [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Terminating instance
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.725 181991 DEBUG nova.compute.manager [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:47:20 compute-0 kernel: tape7942da1-f8 (unregistering): left promiscuous mode
Jan 12 13:47:20 compute-0 NetworkManager[55211]: <info>  [1768225640.7492] device (tape7942da1-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:47:20 compute-0 ovn_controller[94974]: 2026-01-12T13:47:20Z|00091|binding|INFO|Releasing lport e7942da1-f887-4ac9-8a01-72673dab4bd2 from this chassis (sb_readonly=0)
Jan 12 13:47:20 compute-0 ovn_controller[94974]: 2026-01-12T13:47:20Z|00092|binding|INFO|Setting lport e7942da1-f887-4ac9-8a01-72673dab4bd2 down in Southbound
Jan 12 13:47:20 compute-0 ovn_controller[94974]: 2026-01-12T13:47:20Z|00093|binding|INFO|Removing iface tape7942da1-f8 ovn-installed in OVS
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.754 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.756 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.761 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:55:fe 10.100.0.5'], port_security=['fa:16:3e:c0:55:fe 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '38adfe16-dcdc-44a9-8c50-a051037d4bbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b6c357cb-eeaf-4bbf-9955-c174b2707487', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd27fee5-c567-45b5-8a69-ab9b5802587e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=e7942da1-f887-4ac9-8a01-72673dab4bd2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.761 104189 INFO neutron.agent.ovn.metadata.agent [-] Port e7942da1-f887-4ac9-8a01-72673dab4bd2 in datapath a2132c96-bfe0-4c64-a5b0-a3df61a88e5d unbound from our chassis
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.762 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2132c96-bfe0-4c64-a5b0-a3df61a88e5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.763 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1e548bf7-5582-4088-8d77-2abdca8f615a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.763 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d namespace which is not needed anymore
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.778 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 12 13:47:20 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 11.681s CPU time.
Jan 12 13:47:20 compute-0 systemd-machined[153581]: Machine qemu-4-instance-00000004 terminated.
Jan 12 13:47:20 compute-0 podman[211630]: 2026-01-12 13:47:20.814435169 +0000 UTC m=+0.050854122 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 12 13:47:20 compute-0 neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d[211287]: [NOTICE]   (211291) : haproxy version is 2.8.14-c23fe91
Jan 12 13:47:20 compute-0 neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d[211287]: [NOTICE]   (211291) : path to executable is /usr/sbin/haproxy
Jan 12 13:47:20 compute-0 neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d[211287]: [WARNING]  (211291) : Exiting Master process...
Jan 12 13:47:20 compute-0 neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d[211287]: [ALERT]    (211291) : Current worker (211293) exited with code 143 (Terminated)
Jan 12 13:47:20 compute-0 neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d[211287]: [WARNING]  (211291) : All workers exited. Exiting... (0)
Jan 12 13:47:20 compute-0 systemd[1]: libpod-a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a.scope: Deactivated successfully.
Jan 12 13:47:20 compute-0 podman[211667]: 2026-01-12 13:47:20.867000747 +0000 UTC m=+0.031941289 container died a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 12 13:47:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a-userdata-shm.mount: Deactivated successfully.
Jan 12 13:47:20 compute-0 podman[211667]: 2026-01-12 13:47:20.887672546 +0000 UTC m=+0.052613068 container cleanup a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 12 13:47:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-b81441870910472b0a4ec2d2bc5f5e65c6fb651eb92740dc93c68fed778087cd-merged.mount: Deactivated successfully.
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.899 181991 DEBUG nova.compute.manager [req-d3f0a781-adfb-490f-b110-6a96c734d44b req-7e8da43d-4546-46e4-a10b-7e983abf166e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-vif-unplugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.899 181991 DEBUG oslo_concurrency.lockutils [req-d3f0a781-adfb-490f-b110-6a96c734d44b req-7e8da43d-4546-46e4-a10b-7e983abf166e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.899 181991 DEBUG oslo_concurrency.lockutils [req-d3f0a781-adfb-490f-b110-6a96c734d44b req-7e8da43d-4546-46e4-a10b-7e983abf166e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.899 181991 DEBUG oslo_concurrency.lockutils [req-d3f0a781-adfb-490f-b110-6a96c734d44b req-7e8da43d-4546-46e4-a10b-7e983abf166e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.899 181991 DEBUG nova.compute.manager [req-d3f0a781-adfb-490f-b110-6a96c734d44b req-7e8da43d-4546-46e4-a10b-7e983abf166e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] No waiting events found dispatching network-vif-unplugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.900 181991 DEBUG nova.compute.manager [req-d3f0a781-adfb-490f-b110-6a96c734d44b req-7e8da43d-4546-46e4-a10b-7e983abf166e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-vif-unplugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:47:20 compute-0 systemd[1]: libpod-conmon-a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a.scope: Deactivated successfully.
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.940 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.940 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:20 compute-0 podman[211690]: 2026-01-12 13:47:20.941577811 +0000 UTC m=+0.025409854 container remove a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.946 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba80ec1-8a78-448e-9570-3588bcf07547]: (4, ('Mon Jan 12 01:47:20 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d (a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a)\na541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a\nMon Jan 12 01:47:20 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d (a541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a)\na541f457175d9df672f587468aaa1ce462ef24d89c1c34856a0e737a113e224a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.947 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c0cc6fbd-28b6-4418-8bfc-13b37e12a65b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.948 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2132c96-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:20 compute-0 kernel: tapa2132c96-b0: left promiscuous mode
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.951 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.964 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.965 181991 INFO nova.virt.libvirt.driver [-] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Instance destroyed successfully.
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.966 181991 DEBUG nova.objects.instance [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 38adfe16-dcdc-44a9-8c50-a051037d4bbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.967 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.966 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc776ea-09aa-4a9d-9b42-83fcbb1df816]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.975 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[37cefa28-cf59-4e6d-96ba-ff59d63d069c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.975 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c9103a74-ef02-4266-937e-f307916ad9b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.976 181991 DEBUG nova.virt.libvirt.vif [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:46:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1047315553',display_name='tempest-TestNetworkBasicOps-server-1047315553',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1047315553',id=4,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPYa+Z2CAObwJIdbzfDpkgludr9WfwFuPqwsPBX/YPYuQGJOjCo7MNviASGyDeZ3aFWFsbV/fgSZjrzS4BP5rPhMyLWt1gImaMrn3S/xdh0c/z6cvY2QHWueSS6KdTeIhw==',key_name='tempest-TestNetworkBasicOps-838323751',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:46:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-cxt50auh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:46:30Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=38adfe16-dcdc-44a9-8c50-a051037d4bbe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.976 181991 DEBUG nova.network.os_vif_util [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "address": "fa:16:3e:c0:55:fe", "network": {"id": "a2132c96-bfe0-4c64-a5b0-a3df61a88e5d", "bridge": "br-int", "label": "tempest-network-smoke--975181351", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7942da1-f8", "ovs_interfaceid": "e7942da1-f887-4ac9-8a01-72673dab4bd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.977 181991 DEBUG nova.network.os_vif_util [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:55:fe,bridge_name='br-int',has_traffic_filtering=True,id=e7942da1-f887-4ac9-8a01-72673dab4bd2,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7942da1-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.977 181991 DEBUG os_vif [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:55:fe,bridge_name='br-int',has_traffic_filtering=True,id=e7942da1-f887-4ac9-8a01-72673dab4bd2,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7942da1-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.978 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.978 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7942da1-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.980 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.981 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.982 181991 INFO os_vif [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:55:fe,bridge_name='br-int',has_traffic_filtering=True,id=e7942da1-f887-4ac9-8a01-72673dab4bd2,network=Network(a2132c96-bfe0-4c64-a5b0-a3df61a88e5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7942da1-f8')
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.982 181991 INFO nova.virt.libvirt.driver [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Deleting instance files /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe_del
Jan 12 13:47:20 compute-0 nova_compute[181978]: 2026-01-12 13:47:20.983 181991 INFO nova.virt.libvirt.driver [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Deletion of /var/lib/nova/instances/38adfe16-dcdc-44a9-8c50-a051037d4bbe_del complete
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.986 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[f79ea19d-9302-4150-879d-3c073b4a8e98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 265573, 'reachable_time': 25349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211721, 'error': None, 'target': 'ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:20 compute-0 systemd[1]: run-netns-ovnmeta\x2da2132c96\x2dbfe0\x2d4c64\x2da5b0\x2da3df61a88e5d.mount: Deactivated successfully.
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.989 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2132c96-bfe0-4c64-a5b0-a3df61a88e5d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:47:20 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:20.989 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[0a032629-b591-40e0-965a-b77c0aaa9808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.014 181991 INFO nova.compute.manager [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Took 0.29 seconds to destroy the instance on the hypervisor.
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.015 181991 DEBUG oslo.service.loopingcall [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.015 181991 DEBUG nova.compute.manager [-] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.015 181991 DEBUG nova.network.neutron [-] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.476 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.528 181991 DEBUG nova.network.neutron [-] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.537 181991 INFO nova.compute.manager [-] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Took 0.52 seconds to deallocate network for instance.
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.566 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.566 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.602 181991 DEBUG nova.compute.provider_tree [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.610 181991 DEBUG nova.scheduler.client.report [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.622 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.639 181991 INFO nova.scheduler.client.report [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 38adfe16-dcdc-44a9-8c50-a051037d4bbe
Jan 12 13:47:21 compute-0 nova_compute[181978]: 2026-01-12 13:47:21.691 181991 DEBUG oslo_concurrency.lockutils [None req-26bf4450-69f3-449a-b3de-468eff46d5d6 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.191 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.479 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.972 181991 DEBUG nova.compute.manager [req-8ea2464c-130e-4f91-b7d0-f4b15d0ed148 req-d162e444-2132-4682-b7cb-4eeffee5d569 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.973 181991 DEBUG oslo_concurrency.lockutils [req-8ea2464c-130e-4f91-b7d0-f4b15d0ed148 req-d162e444-2132-4682-b7cb-4eeffee5d569 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.973 181991 DEBUG oslo_concurrency.lockutils [req-8ea2464c-130e-4f91-b7d0-f4b15d0ed148 req-d162e444-2132-4682-b7cb-4eeffee5d569 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.973 181991 DEBUG oslo_concurrency.lockutils [req-8ea2464c-130e-4f91-b7d0-f4b15d0ed148 req-d162e444-2132-4682-b7cb-4eeffee5d569 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "38adfe16-dcdc-44a9-8c50-a051037d4bbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.973 181991 DEBUG nova.compute.manager [req-8ea2464c-130e-4f91-b7d0-f4b15d0ed148 req-d162e444-2132-4682-b7cb-4eeffee5d569 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] No waiting events found dispatching network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.973 181991 WARNING nova.compute.manager [req-8ea2464c-130e-4f91-b7d0-f4b15d0ed148 req-d162e444-2132-4682-b7cb-4eeffee5d569 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received unexpected event network-vif-plugged-e7942da1-f887-4ac9-8a01-72673dab4bd2 for instance with vm_state deleted and task_state None.
Jan 12 13:47:22 compute-0 nova_compute[181978]: 2026-01-12 13:47:22.974 181991 DEBUG nova.compute.manager [req-8ea2464c-130e-4f91-b7d0-f4b15d0ed148 req-d162e444-2132-4682-b7cb-4eeffee5d569 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Received event network-vif-deleted-e7942da1-f887-4ac9-8a01-72673dab4bd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:25 compute-0 nova_compute[181978]: 2026-01-12 13:47:25.979 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:27 compute-0 nova_compute[181978]: 2026-01-12 13:47:27.193 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:27 compute-0 nova_compute[181978]: 2026-01-12 13:47:27.313 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:27 compute-0 nova_compute[181978]: 2026-01-12 13:47:27.387 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:30 compute-0 nova_compute[181978]: 2026-01-12 13:47:30.981 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:31 compute-0 podman[211724]: 2026-01-12 13:47:31.543766584 +0000 UTC m=+0.037569316 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:47:32 compute-0 nova_compute[181978]: 2026-01-12 13:47:32.195 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:32 compute-0 nova_compute[181978]: 2026-01-12 13:47:32.303 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225637.3031783, 0f7f449a-6fda-4afe-93ed-05763a43015b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:32 compute-0 nova_compute[181978]: 2026-01-12 13:47:32.304 181991 INFO nova.compute.manager [-] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] VM Stopped (Lifecycle Event)
Jan 12 13:47:32 compute-0 nova_compute[181978]: 2026-01-12 13:47:32.321 181991 DEBUG nova.compute.manager [None req-657a33ea-f78a-4ad6-97ed-fb800d4933c1 - - - - - -] [instance: 0f7f449a-6fda-4afe-93ed-05763a43015b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:34 compute-0 podman[211745]: 2026-01-12 13:47:34.554586519 +0000 UTC m=+0.043354258 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 12 13:47:35 compute-0 nova_compute[181978]: 2026-01-12 13:47:35.962 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225640.9592416, 38adfe16-dcdc-44a9-8c50-a051037d4bbe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:35 compute-0 nova_compute[181978]: 2026-01-12 13:47:35.963 181991 INFO nova.compute.manager [-] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] VM Stopped (Lifecycle Event)
Jan 12 13:47:35 compute-0 nova_compute[181978]: 2026-01-12 13:47:35.979 181991 DEBUG nova.compute.manager [None req-10d57079-e007-401f-94a2-bdfab6038501 - - - - - -] [instance: 38adfe16-dcdc-44a9-8c50-a051037d4bbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:35 compute-0 nova_compute[181978]: 2026-01-12 13:47:35.982 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:37 compute-0 nova_compute[181978]: 2026-01-12 13:47:37.197 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.254 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.254 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.268 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.335 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.335 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.340 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.341 181991 INFO nova.compute.claims [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.415 181991 DEBUG nova.compute.provider_tree [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.425 181991 DEBUG nova.scheduler.client.report [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.437 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.438 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.470 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.470 181991 DEBUG nova.network.neutron [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.480 181991 INFO nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.490 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.551 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.551 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.552 181991 INFO nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Creating image(s)
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.552 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.552 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.553 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.562 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.604 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.605 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.605 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.614 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.654 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.655 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.668 181991 DEBUG nova.policy [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.671 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk 1073741824" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.671 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.672 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.712 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.713 181991 DEBUG nova.virt.disk.api [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.713 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.754 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.754 181991 DEBUG nova.virt.disk.api [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.755 181991 DEBUG nova.objects.instance [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.768 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.768 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Ensure instance console log exists: /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.769 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.769 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:38 compute-0 nova_compute[181978]: 2026-01-12 13:47:38.769 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:39 compute-0 nova_compute[181978]: 2026-01-12 13:47:39.175 181991 DEBUG nova.network.neutron [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Successfully created port: f4b71a00-88ba-4e02-82f2-54866d84d7bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:47:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:40.200 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:40.200 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:40.200 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.317 181991 DEBUG nova.network.neutron [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Successfully updated port: f4b71a00-88ba-4e02-82f2-54866d84d7bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.334 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.334 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.335 181991 DEBUG nova.network.neutron [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.412 181991 DEBUG nova.compute.manager [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-changed-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.413 181991 DEBUG nova.compute.manager [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing instance network info cache due to event network-changed-f4b71a00-88ba-4e02-82f2-54866d84d7bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.413 181991 DEBUG oslo_concurrency.lockutils [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.458 181991 DEBUG nova.network.neutron [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:47:40 compute-0 nova_compute[181978]: 2026-01-12 13:47:40.983 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.520 181991 DEBUG nova.network.neutron [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.535 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.535 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Instance network_info: |[{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.536 181991 DEBUG oslo_concurrency.lockutils [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.536 181991 DEBUG nova.network.neutron [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing network info cache for port f4b71a00-88ba-4e02-82f2-54866d84d7bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.538 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Start _get_guest_xml network_info=[{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.542 181991 WARNING nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.549 181991 DEBUG nova.virt.libvirt.host [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.549 181991 DEBUG nova.virt.libvirt.host [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.552 181991 DEBUG nova.virt.libvirt.host [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.552 181991 DEBUG nova.virt.libvirt.host [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.553 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.553 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.553 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.553 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.554 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.554 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.554 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.554 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.554 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.554 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.555 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.555 181991 DEBUG nova.virt.hardware [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.557 181991 DEBUG nova.virt.libvirt.vif [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:47:38Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.558 181991 DEBUG nova.network.os_vif_util [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.558 181991 DEBUG nova.network.os_vif_util [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:61:fe,bridge_name='br-int',has_traffic_filtering=True,id=f4b71a00-88ba-4e02-82f2-54866d84d7bd,network=Network(956086f6-7f4d-41c7-b756-f2665bee9e93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4b71a00-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.559 181991 DEBUG nova.objects.instance [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.570 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <uuid>e21c2b66-4a73-4093-b44b-c47371cf431e</uuid>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <name>instance-00000006</name>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:47:41</nova:creationTime>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:47:41 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <system>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <entry name="serial">e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <entry name="uuid">e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </system>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <os>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   </os>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <features>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   </features>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:86:61:fe"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <target dev="tapf4b71a00-88"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log" append="off"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <video>
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </video>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:47:41 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:47:41 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:47:41 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:47:41 compute-0 nova_compute[181978]: </domain>
Jan 12 13:47:41 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.570 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Preparing to wait for external event network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.571 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.571 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.571 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.572 181991 DEBUG nova.virt.libvirt.vif [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:47:38Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.572 181991 DEBUG nova.network.os_vif_util [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.572 181991 DEBUG nova.network.os_vif_util [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:61:fe,bridge_name='br-int',has_traffic_filtering=True,id=f4b71a00-88ba-4e02-82f2-54866d84d7bd,network=Network(956086f6-7f4d-41c7-b756-f2665bee9e93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4b71a00-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.572 181991 DEBUG os_vif [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:61:fe,bridge_name='br-int',has_traffic_filtering=True,id=f4b71a00-88ba-4e02-82f2-54866d84d7bd,network=Network(956086f6-7f4d-41c7-b756-f2665bee9e93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4b71a00-88') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.573 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.573 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.573 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.576 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.576 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4b71a00-88, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.576 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf4b71a00-88, col_values=(('external_ids', {'iface-id': 'f4b71a00-88ba-4e02-82f2-54866d84d7bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:61:fe', 'vm-uuid': 'e21c2b66-4a73-4093-b44b-c47371cf431e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.577 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:41 compute-0 NetworkManager[55211]: <info>  [1768225661.5789] manager: (tapf4b71a00-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.579 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.580 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.581 181991 INFO os_vif [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:61:fe,bridge_name='br-int',has_traffic_filtering=True,id=f4b71a00-88ba-4e02-82f2-54866d84d7bd,network=Network(956086f6-7f4d-41c7-b756-f2665bee9e93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4b71a00-88')
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.613 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.613 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.614 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:86:61:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:47:41 compute-0 nova_compute[181978]: 2026-01-12 13:47:41.614 181991 INFO nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Using config drive
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.199 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.281 181991 INFO nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Creating config drive at /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.285 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8l18dlx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.400 181991 DEBUG oslo_concurrency.processutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8l18dlx" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:47:42 compute-0 NetworkManager[55211]: <info>  [1768225662.4303] manager: (tapf4b71a00-88): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 12 13:47:42 compute-0 kernel: tapf4b71a00-88: entered promiscuous mode
Jan 12 13:47:42 compute-0 ovn_controller[94974]: 2026-01-12T13:47:42Z|00094|binding|INFO|Claiming lport f4b71a00-88ba-4e02-82f2-54866d84d7bd for this chassis.
Jan 12 13:47:42 compute-0 ovn_controller[94974]: 2026-01-12T13:47:42Z|00095|binding|INFO|f4b71a00-88ba-4e02-82f2-54866d84d7bd: Claiming fa:16:3e:86:61:fe 10.100.0.14
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.433 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.443 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:61:fe 10.100.0.14'], port_security=['fa:16:3e:86:61:fe 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e21c2b66-4a73-4093-b44b-c47371cf431e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-956086f6-7f4d-41c7-b756-f2665bee9e93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac7c355d-263e-491c-8fb3-a7c4644a1471', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14e3122d-a14b-402a-a089-41556bf5f4e9, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=f4b71a00-88ba-4e02-82f2-54866d84d7bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.443 104189 INFO neutron.agent.ovn.metadata.agent [-] Port f4b71a00-88ba-4e02-82f2-54866d84d7bd in datapath 956086f6-7f4d-41c7-b756-f2665bee9e93 bound to our chassis
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.444 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 956086f6-7f4d-41c7-b756-f2665bee9e93
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.451 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[2c5be4ed-f5de-4d0d-b578-04b7e39d0f4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.452 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap956086f6-71 in ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.453 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap956086f6-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.453 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[641ce386-731a-424a-ad60-7a97c84a01ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.454 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0551ad-7802-4076-939a-774b53a74ac9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 systemd-udevd[211797]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.464 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[81df239f-daca-4984-a1fa-59a0ae92b65e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 systemd-machined[153581]: New machine qemu-6-instance-00000006.
Jan 12 13:47:42 compute-0 NetworkManager[55211]: <info>  [1768225662.4701] device (tapf4b71a00-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:47:42 compute-0 NetworkManager[55211]: <info>  [1768225662.4706] device (tapf4b71a00-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.484 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe74548-c432-4645-91d4-9eab959aa4f9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.495 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:42 compute-0 ovn_controller[94974]: 2026-01-12T13:47:42Z|00096|binding|INFO|Setting lport f4b71a00-88ba-4e02-82f2-54866d84d7bd ovn-installed in OVS
Jan 12 13:47:42 compute-0 ovn_controller[94974]: 2026-01-12T13:47:42Z|00097|binding|INFO|Setting lport f4b71a00-88ba-4e02-82f2-54866d84d7bd up in Southbound
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.499 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.507 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[c56a45b5-737c-4013-bad1-36c0077fd912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.510 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[4b46e756-3f69-41a7-bf0d-0e705da6daff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 NetworkManager[55211]: <info>  [1768225662.5114] manager: (tap956086f6-70): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.531 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc815e7-b550-42e8-98c1-98d8c4712cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.533 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[1925568b-8790-4e55-b401-698f1764d486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 NetworkManager[55211]: <info>  [1768225662.5492] device (tap956086f6-70): carrier: link connected
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.552 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[d08ed6f4-c199-4854-ada7-469798b3313b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.564 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3350c15c-7a62-4099-a1a9-02c66e05dc4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap956086f6-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:36:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 272857, 'reachable_time': 39713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211823, 'error': None, 'target': 'ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.575 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[2583b66b-71a3-4616-a4b7-a11f4428e08c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:365a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 272857, 'tstamp': 272857}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211824, 'error': None, 'target': 'ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.584 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[fe21cbf0-cf35-4ab4-9c27-9660f8222a24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap956086f6-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:36:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 272857, 'reachable_time': 39713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211825, 'error': None, 'target': 'ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.602 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3df2200d-b149-45c9-a2d3-ef7854e7913d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.640 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[5bebc79b-37c0-4a6a-be60-59ecff0467a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.641 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap956086f6-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.641 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.642 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap956086f6-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:42 compute-0 kernel: tap956086f6-70: entered promiscuous mode
Jan 12 13:47:42 compute-0 NetworkManager[55211]: <info>  [1768225662.6438] manager: (tap956086f6-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.643 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.648 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap956086f6-70, col_values=(('external_ids', {'iface-id': 'c493e866-7a68-4689-83e5-56bf74dbaba7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.649 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:42 compute-0 ovn_controller[94974]: 2026-01-12T13:47:42Z|00098|binding|INFO|Releasing lport c493e866-7a68-4689-83e5-56bf74dbaba7 from this chassis (sb_readonly=0)
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.652 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/956086f6-7f4d-41c7-b756-f2665bee9e93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/956086f6-7f4d-41c7-b756-f2665bee9e93.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.661 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[72603a81-67d6-4404-b9b0-afe447d0d5f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.662 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-956086f6-7f4d-41c7-b756-f2665bee9e93
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/956086f6-7f4d-41c7-b756-f2665bee9e93.pid.haproxy
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID 956086f6-7f4d-41c7-b756-f2665bee9e93
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.663 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:42 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:47:42.664 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93', 'env', 'PROCESS_TAG=haproxy-956086f6-7f4d-41c7-b756-f2665bee9e93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/956086f6-7f4d-41c7-b756-f2665bee9e93.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.678 181991 DEBUG nova.compute.manager [req-e23b3c62-d780-43c3-9d30-5f828bf680f5 req-e99b3713-73d6-40e9-830a-f9a1c75120f7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.678 181991 DEBUG oslo_concurrency.lockutils [req-e23b3c62-d780-43c3-9d30-5f828bf680f5 req-e99b3713-73d6-40e9-830a-f9a1c75120f7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.679 181991 DEBUG oslo_concurrency.lockutils [req-e23b3c62-d780-43c3-9d30-5f828bf680f5 req-e99b3713-73d6-40e9-830a-f9a1c75120f7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.679 181991 DEBUG oslo_concurrency.lockutils [req-e23b3c62-d780-43c3-9d30-5f828bf680f5 req-e99b3713-73d6-40e9-830a-f9a1c75120f7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.679 181991 DEBUG nova.compute.manager [req-e23b3c62-d780-43c3-9d30-5f828bf680f5 req-e99b3713-73d6-40e9-830a-f9a1c75120f7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Processing event network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.835 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225662.83539, e21c2b66-4a73-4093-b44b-c47371cf431e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.836 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] VM Started (Lifecycle Event)
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.838 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.841 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.843 181991 INFO nova.virt.libvirt.driver [-] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Instance spawned successfully.
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.843 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.854 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.858 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.860 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.860 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.861 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.861 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.861 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.862 181991 DEBUG nova.virt.libvirt.driver [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.879 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.879 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225662.8377938, e21c2b66-4a73-4093-b44b-c47371cf431e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.880 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] VM Paused (Lifecycle Event)
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.898 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.900 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225662.840193, e21c2b66-4a73-4093-b44b-c47371cf431e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.900 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] VM Resumed (Lifecycle Event)
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.908 181991 INFO nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Took 4.36 seconds to spawn the instance on the hypervisor.
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.908 181991 DEBUG nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.927 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.928 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.946 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:47:42 compute-0 podman[211860]: 2026-01-12 13:47:42.948705848 +0000 UTC m=+0.034044777 container create 2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.960 181991 INFO nova.compute.manager [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Took 4.65 seconds to build instance.
Jan 12 13:47:42 compute-0 nova_compute[181978]: 2026-01-12 13:47:42.969 181991 DEBUG oslo_concurrency.lockutils [None req-a48b7f80-f79c-4ad0-a095-235d3024c37f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:42 compute-0 systemd[1]: Started libpod-conmon-2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331.scope.
Jan 12 13:47:42 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:47:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ac7b29713008e981bae8a0d73708d026ace941b38808da007f3c5b1e61a4978/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:47:43 compute-0 podman[211860]: 2026-01-12 13:47:43.011019105 +0000 UTC m=+0.096358054 container init 2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 12 13:47:43 compute-0 podman[211860]: 2026-01-12 13:47:43.016220158 +0000 UTC m=+0.101559097 container start 2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:47:43 compute-0 podman[211860]: 2026-01-12 13:47:42.933094364 +0000 UTC m=+0.018433313 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:47:43 compute-0 neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93[211872]: [NOTICE]   (211876) : New worker (211878) forked
Jan 12 13:47:43 compute-0 neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93[211872]: [NOTICE]   (211876) : Loading success.
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.223 181991 DEBUG nova.network.neutron [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updated VIF entry in instance network info cache for port f4b71a00-88ba-4e02-82f2-54866d84d7bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.224 181991 DEBUG nova.network.neutron [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.238 181991 DEBUG oslo_concurrency.lockutils [req-fabd1bf2-6946-4e32-a6b9-83051633316f req-b3fe7d8d-375e-46fa-a978-7ab50c2c8ea8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:47:44 compute-0 podman[211884]: 2026-01-12 13:47:44.558628026 +0000 UTC m=+0.050851739 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 12 13:47:44 compute-0 podman[211885]: 2026-01-12 13:47:44.574448924 +0000 UTC m=+0.063396615 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Jan 12 13:47:44 compute-0 podman[211883]: 2026-01-12 13:47:44.585055564 +0000 UTC m=+0.079683399 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.741 181991 DEBUG nova.compute.manager [req-efb3354f-2012-4055-be42-b22e65b92a05 req-f9281341-fc71-4c98-b076-57fa4a059d71 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.742 181991 DEBUG oslo_concurrency.lockutils [req-efb3354f-2012-4055-be42-b22e65b92a05 req-f9281341-fc71-4c98-b076-57fa4a059d71 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.742 181991 DEBUG oslo_concurrency.lockutils [req-efb3354f-2012-4055-be42-b22e65b92a05 req-f9281341-fc71-4c98-b076-57fa4a059d71 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.742 181991 DEBUG oslo_concurrency.lockutils [req-efb3354f-2012-4055-be42-b22e65b92a05 req-f9281341-fc71-4c98-b076-57fa4a059d71 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.742 181991 DEBUG nova.compute.manager [req-efb3354f-2012-4055-be42-b22e65b92a05 req-f9281341-fc71-4c98-b076-57fa4a059d71 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] No waiting events found dispatching network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:47:44 compute-0 nova_compute[181978]: 2026-01-12 13:47:44.743 181991 WARNING nova.compute.manager [req-efb3354f-2012-4055-be42-b22e65b92a05 req-f9281341-fc71-4c98-b076-57fa4a059d71 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received unexpected event network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd for instance with vm_state active and task_state None.
Jan 12 13:47:45 compute-0 nova_compute[181978]: 2026-01-12 13:47:45.720 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:45 compute-0 NetworkManager[55211]: <info>  [1768225665.7210] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 12 13:47:45 compute-0 NetworkManager[55211]: <info>  [1768225665.7215] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 12 13:47:45 compute-0 ovn_controller[94974]: 2026-01-12T13:47:45Z|00099|binding|INFO|Releasing lport c493e866-7a68-4689-83e5-56bf74dbaba7 from this chassis (sb_readonly=0)
Jan 12 13:47:45 compute-0 ovn_controller[94974]: 2026-01-12T13:47:45Z|00100|binding|INFO|Releasing lport c493e866-7a68-4689-83e5-56bf74dbaba7 from this chassis (sb_readonly=0)
Jan 12 13:47:45 compute-0 nova_compute[181978]: 2026-01-12 13:47:45.745 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:45 compute-0 nova_compute[181978]: 2026-01-12 13:47:45.747 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:46 compute-0 nova_compute[181978]: 2026-01-12 13:47:46.577 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:46 compute-0 nova_compute[181978]: 2026-01-12 13:47:46.814 181991 DEBUG nova.compute.manager [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-changed-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:47:46 compute-0 nova_compute[181978]: 2026-01-12 13:47:46.814 181991 DEBUG nova.compute.manager [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing instance network info cache due to event network-changed-f4b71a00-88ba-4e02-82f2-54866d84d7bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:47:46 compute-0 nova_compute[181978]: 2026-01-12 13:47:46.815 181991 DEBUG oslo_concurrency.lockutils [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:47:46 compute-0 nova_compute[181978]: 2026-01-12 13:47:46.815 181991 DEBUG oslo_concurrency.lockutils [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:47:46 compute-0 nova_compute[181978]: 2026-01-12 13:47:46.815 181991 DEBUG nova.network.neutron [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing network info cache for port f4b71a00-88ba-4e02-82f2-54866d84d7bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:47:47 compute-0 nova_compute[181978]: 2026-01-12 13:47:47.198 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:47 compute-0 nova_compute[181978]: 2026-01-12 13:47:47.823 181991 DEBUG nova.network.neutron [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updated VIF entry in instance network info cache for port f4b71a00-88ba-4e02-82f2-54866d84d7bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:47:47 compute-0 nova_compute[181978]: 2026-01-12 13:47:47.823 181991 DEBUG nova.network.neutron [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:47:47 compute-0 nova_compute[181978]: 2026-01-12 13:47:47.838 181991 DEBUG oslo_concurrency.lockutils [req-0e9a2281-ca4b-404f-9e4b-9c7db322949e req-e1a2ca88-68e8-4935-bb1f-feda49236585 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:47:51 compute-0 podman[211949]: 2026-01-12 13:47:51.543408813 +0000 UTC m=+0.038185626 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 12 13:47:51 compute-0 nova_compute[181978]: 2026-01-12 13:47:51.579 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:52 compute-0 nova_compute[181978]: 2026-01-12 13:47:52.200 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:52 compute-0 ovn_controller[94974]: 2026-01-12T13:47:52Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:61:fe 10.100.0.14
Jan 12 13:47:52 compute-0 ovn_controller[94974]: 2026-01-12T13:47:52Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:61:fe 10.100.0.14
Jan 12 13:47:56 compute-0 nova_compute[181978]: 2026-01-12 13:47:56.579 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:57 compute-0 nova_compute[181978]: 2026-01-12 13:47:57.201 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:47:58 compute-0 nova_compute[181978]: 2026-01-12 13:47:58.182 181991 INFO nova.compute.manager [None req-06357fa0-1fcf-4d1e-9110-dc9f89b17717 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Get console output
Jan 12 13:47:58 compute-0 nova_compute[181978]: 2026-01-12 13:47:58.185 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:48:00 compute-0 nova_compute[181978]: 2026-01-12 13:48:00.658 181991 DEBUG oslo_concurrency.lockutils [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "interface-e21c2b66-4a73-4093-b44b-c47371cf431e-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:00 compute-0 nova_compute[181978]: 2026-01-12 13:48:00.658 181991 DEBUG oslo_concurrency.lockutils [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-e21c2b66-4a73-4093-b44b-c47371cf431e-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:00 compute-0 nova_compute[181978]: 2026-01-12 13:48:00.659 181991 DEBUG nova.objects.instance [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'flavor' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:00 compute-0 nova_compute[181978]: 2026-01-12 13:48:00.926 181991 DEBUG nova.objects.instance [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_requests' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:00 compute-0 nova_compute[181978]: 2026-01-12 13:48:00.947 181991 DEBUG nova.network.neutron [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:48:01 compute-0 nova_compute[181978]: 2026-01-12 13:48:01.069 181991 DEBUG nova.policy [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:48:01 compute-0 nova_compute[181978]: 2026-01-12 13:48:01.582 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:02 compute-0 nova_compute[181978]: 2026-01-12 13:48:02.202 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:02 compute-0 nova_compute[181978]: 2026-01-12 13:48:02.275 181991 DEBUG nova.network.neutron [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Successfully created port: eafb6dbf-17d8-48eb-b6d5-3942724ec106 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:48:02 compute-0 podman[211976]: 2026-01-12 13:48:02.565401394 +0000 UTC m=+0.062148548 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:48:05 compute-0 nova_compute[181978]: 2026-01-12 13:48:05.219 181991 DEBUG nova.network.neutron [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Successfully updated port: eafb6dbf-17d8-48eb-b6d5-3942724ec106 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:48:05 compute-0 nova_compute[181978]: 2026-01-12 13:48:05.235 181991 DEBUG oslo_concurrency.lockutils [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:05 compute-0 nova_compute[181978]: 2026-01-12 13:48:05.235 181991 DEBUG oslo_concurrency.lockutils [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:05 compute-0 nova_compute[181978]: 2026-01-12 13:48:05.236 181991 DEBUG nova.network.neutron [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:48:05 compute-0 nova_compute[181978]: 2026-01-12 13:48:05.314 181991 DEBUG nova.compute.manager [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-changed-eafb6dbf-17d8-48eb-b6d5-3942724ec106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:05 compute-0 nova_compute[181978]: 2026-01-12 13:48:05.314 181991 DEBUG nova.compute.manager [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing instance network info cache due to event network-changed-eafb6dbf-17d8-48eb-b6d5-3942724ec106. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:48:05 compute-0 nova_compute[181978]: 2026-01-12 13:48:05.314 181991 DEBUG oslo_concurrency.lockutils [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:05 compute-0 podman[211997]: 2026-01-12 13:48:05.539302606 +0000 UTC m=+0.036888315 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 12 13:48:06 compute-0 nova_compute[181978]: 2026-01-12 13:48:06.584 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:07 compute-0 nova_compute[181978]: 2026-01-12 13:48:07.203 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.493 181991 DEBUG nova.network.neutron [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.509 181991 DEBUG oslo_concurrency.lockutils [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.510 181991 DEBUG oslo_concurrency.lockutils [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.510 181991 DEBUG nova.network.neutron [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing network info cache for port eafb6dbf-17d8-48eb-b6d5-3942724ec106 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.512 181991 DEBUG nova.virt.libvirt.vif [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.513 181991 DEBUG nova.network.os_vif_util [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.513 181991 DEBUG nova.network.os_vif_util [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.514 181991 DEBUG os_vif [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.514 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.514 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.515 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.517 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.517 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeafb6dbf-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.518 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeafb6dbf-17, col_values=(('external_ids', {'iface-id': 'eafb6dbf-17d8-48eb-b6d5-3942724ec106', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:32:72', 'vm-uuid': 'e21c2b66-4a73-4093-b44b-c47371cf431e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:08 compute-0 NetworkManager[55211]: <info>  [1768225688.5198] manager: (tapeafb6dbf-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.521 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.522 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.525 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.526 181991 INFO os_vif [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17')
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.527 181991 DEBUG nova.virt.libvirt.vif [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.527 181991 DEBUG nova.network.os_vif_util [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.527 181991 DEBUG nova.network.os_vif_util [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.529 181991 DEBUG nova.virt.libvirt.guest [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] attach device xml: <interface type="ethernet">
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <mac address="fa:16:3e:af:32:72"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <model type="virtio"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <mtu size="1442"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <target dev="tapeafb6dbf-17"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]: </interface>
Jan 12 13:48:08 compute-0 nova_compute[181978]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 12 13:48:08 compute-0 kernel: tapeafb6dbf-17: entered promiscuous mode
Jan 12 13:48:08 compute-0 ovn_controller[94974]: 2026-01-12T13:48:08Z|00101|binding|INFO|Claiming lport eafb6dbf-17d8-48eb-b6d5-3942724ec106 for this chassis.
Jan 12 13:48:08 compute-0 ovn_controller[94974]: 2026-01-12T13:48:08Z|00102|binding|INFO|eafb6dbf-17d8-48eb-b6d5-3942724ec106: Claiming fa:16:3e:af:32:72 10.100.0.20
Jan 12 13:48:08 compute-0 NetworkManager[55211]: <info>  [1768225688.5420] manager: (tapeafb6dbf-17): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.541 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.553 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:32:72 10.100.0.20'], port_security=['fa:16:3e:af:32:72 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'e21c2b66-4a73-4093-b44b-c47371cf431e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e7848670-66d3-47c2-aa04-0080edfddbef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a476fd1-5f2d-4feb-930a-1d269face980, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=eafb6dbf-17d8-48eb-b6d5-3942724ec106) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.554 104189 INFO neutron.agent.ovn.metadata.agent [-] Port eafb6dbf-17d8-48eb-b6d5-3942724ec106 in datapath f937e860-a51b-4e2b-b213-ca4bc16774e1 bound to our chassis
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.555 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f937e860-a51b-4e2b-b213-ca4bc16774e1
Jan 12 13:48:08 compute-0 systemd-udevd[212021]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.564 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[bb260324-b756-40ad-9e63-d1503fee6810]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.564 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf937e860-a1 in ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.566 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf937e860-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.566 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[20b8f245-7861-464c-bd97-7d4103a1a36d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.566 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[afc10f3d-273e-4134-8345-6354f6c175f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 NetworkManager[55211]: <info>  [1768225688.5720] device (tapeafb6dbf-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:48:08 compute-0 NetworkManager[55211]: <info>  [1768225688.5727] device (tapeafb6dbf-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:48:08 compute-0 ovn_controller[94974]: 2026-01-12T13:48:08Z|00103|binding|INFO|Setting lport eafb6dbf-17d8-48eb-b6d5-3942724ec106 ovn-installed in OVS
Jan 12 13:48:08 compute-0 ovn_controller[94974]: 2026-01-12T13:48:08Z|00104|binding|INFO|Setting lport eafb6dbf-17d8-48eb-b6d5-3942724ec106 up in Southbound
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.580 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.583 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[51360018-4b66-458a-8810-17b9ecfda35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.601 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e228b7d6-b8bf-49b7-8620-d7b7a4f5fa6a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.618 181991 DEBUG nova.virt.libvirt.driver [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.618 181991 DEBUG nova.virt.libvirt.driver [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.618 181991 DEBUG nova.virt.libvirt.driver [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:86:61:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.618 181991 DEBUG nova.virt.libvirt.driver [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:af:32:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.619 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[bd8dd75c-0e01-45d0-b201-96ace6878d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 NetworkManager[55211]: <info>  [1768225688.6244] manager: (tapf937e860-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.623 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e590933d-fd12-4c1c-ad43-2c1631a226ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 systemd-udevd[212023]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.638 181991 DEBUG nova.virt.libvirt.guest [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:48:08</nova:creationTime>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:48:08 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     <nova:port uuid="eafb6dbf-17d8-48eb-b6d5-3942724ec106">
Jan 12 13:48:08 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 12 13:48:08 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:08 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:48:08 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:48:08 compute-0 nova_compute[181978]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.648 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[8c017458-f4ac-4330-a696-3e6543ea6149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.652 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[d9221f40-9c20-4939-a187-610bbdc9b0e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.658 181991 DEBUG oslo_concurrency.lockutils [None req-33680e90-ecad-409e-9877-decbe08eff12 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-e21c2b66-4a73-4093-b44b-c47371cf431e-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:08 compute-0 NetworkManager[55211]: <info>  [1768225688.6679] device (tapf937e860-a0): carrier: link connected
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.670 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[78645b5b-4b13-4282-b588-f4471a8dc615]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.684 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[6ccdb6c2-0fc1-4fa8-9bcb-44dc4a2eb021]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf937e860-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:2b:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 275469, 'reachable_time': 43110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212039, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.694 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c6197fe1-d9a6-4d89-975e-4a898536af20]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:2b3a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 275469, 'tstamp': 275469}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212040, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.705 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe2a3fe-6be7-406b-80cb-69b6c0a54c24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf937e860-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:2b:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 275469, 'reachable_time': 43110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212041, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.724 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[39098d20-0612-42e2-8698-323025b98595]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.760 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[29cd5ea1-fca9-4438-a20c-79f65e3c2b91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.760 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf937e860-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.761 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.761 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf937e860-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.764 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 kernel: tapf937e860-a0: entered promiscuous mode
Jan 12 13:48:08 compute-0 NetworkManager[55211]: <info>  [1768225688.7652] manager: (tapf937e860-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.767 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf937e860-a0, col_values=(('external_ids', {'iface-id': 'a49f1399-8753-45e9-a64a-e65639cdeac7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:08 compute-0 ovn_controller[94974]: 2026-01-12T13:48:08Z|00105|binding|INFO|Releasing lport a49f1399-8753-45e9-a64a-e65639cdeac7 from this chassis (sb_readonly=0)
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.768 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.771 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f937e860-a51b-4e2b-b213-ca4bc16774e1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f937e860-a51b-4e2b-b213-ca4bc16774e1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.771 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8814570f-f9eb-4092-a660-060dc25b96fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.772 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-f937e860-a51b-4e2b-b213-ca4bc16774e1
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/f937e860-a51b-4e2b-b213-ca4bc16774e1.pid.haproxy
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID f937e860-a51b-4e2b-b213-ca4bc16774e1
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:48:08 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:08.773 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'env', 'PROCESS_TAG=haproxy-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f937e860-a51b-4e2b-b213-ca4bc16774e1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.780 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.789 181991 DEBUG nova.compute.manager [req-3e5ef51b-31e0-4a3b-bfe1-ead31cfc5557 req-088dfaf3-929b-44c8-9be4-039b02dbdc0e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.789 181991 DEBUG oslo_concurrency.lockutils [req-3e5ef51b-31e0-4a3b-bfe1-ead31cfc5557 req-088dfaf3-929b-44c8-9be4-039b02dbdc0e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.790 181991 DEBUG oslo_concurrency.lockutils [req-3e5ef51b-31e0-4a3b-bfe1-ead31cfc5557 req-088dfaf3-929b-44c8-9be4-039b02dbdc0e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.790 181991 DEBUG oslo_concurrency.lockutils [req-3e5ef51b-31e0-4a3b-bfe1-ead31cfc5557 req-088dfaf3-929b-44c8-9be4-039b02dbdc0e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.790 181991 DEBUG nova.compute.manager [req-3e5ef51b-31e0-4a3b-bfe1-ead31cfc5557 req-088dfaf3-929b-44c8-9be4-039b02dbdc0e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] No waiting events found dispatching network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:08 compute-0 nova_compute[181978]: 2026-01-12 13:48:08.791 181991 WARNING nova.compute.manager [req-3e5ef51b-31e0-4a3b-bfe1-ead31cfc5557 req-088dfaf3-929b-44c8-9be4-039b02dbdc0e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received unexpected event network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 for instance with vm_state active and task_state None.
Jan 12 13:48:09 compute-0 podman[212070]: 2026-01-12 13:48:09.044252844 +0000 UTC m=+0.030536077 container create df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true)
Jan 12 13:48:09 compute-0 systemd[1]: Started libpod-conmon-df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0.scope.
Jan 12 13:48:09 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:48:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/842558a7ffdefeb2bcf526ce8064799f74f31c41c05d455fb7578c36770b821f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:48:09 compute-0 podman[212070]: 2026-01-12 13:48:09.089001433 +0000 UTC m=+0.075284685 container init df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:48:09 compute-0 podman[212070]: 2026-01-12 13:48:09.093662349 +0000 UTC m=+0.079945592 container start df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 12 13:48:09 compute-0 podman[212070]: 2026-01-12 13:48:09.029388535 +0000 UTC m=+0.015671777 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:48:09 compute-0 neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1[212082]: [NOTICE]   (212086) : New worker (212088) forked
Jan 12 13:48:09 compute-0 neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1[212082]: [NOTICE]   (212086) : Loading success.
Jan 12 13:48:09 compute-0 nova_compute[181978]: 2026-01-12 13:48:09.335 181991 DEBUG nova.network.neutron [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updated VIF entry in instance network info cache for port eafb6dbf-17d8-48eb-b6d5-3942724ec106. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:48:09 compute-0 nova_compute[181978]: 2026-01-12 13:48:09.336 181991 DEBUG nova.network.neutron [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:09 compute-0 nova_compute[181978]: 2026-01-12 13:48:09.352 181991 DEBUG oslo_concurrency.lockutils [req-850633ef-7acd-4d26-a6c4-fd78e4a73567 req-0a3000ae-5706-4d91-8370-bfd5f1f244a2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:10 compute-0 nova_compute[181978]: 2026-01-12 13:48:10.863 181991 DEBUG nova.compute.manager [req-c55b3191-4f31-4eb2-b7c5-d324dfec876f req-3df17498-5ae4-4d8e-aefc-fd740239869f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:10 compute-0 nova_compute[181978]: 2026-01-12 13:48:10.864 181991 DEBUG oslo_concurrency.lockutils [req-c55b3191-4f31-4eb2-b7c5-d324dfec876f req-3df17498-5ae4-4d8e-aefc-fd740239869f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:10 compute-0 nova_compute[181978]: 2026-01-12 13:48:10.864 181991 DEBUG oslo_concurrency.lockutils [req-c55b3191-4f31-4eb2-b7c5-d324dfec876f req-3df17498-5ae4-4d8e-aefc-fd740239869f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:10 compute-0 nova_compute[181978]: 2026-01-12 13:48:10.865 181991 DEBUG oslo_concurrency.lockutils [req-c55b3191-4f31-4eb2-b7c5-d324dfec876f req-3df17498-5ae4-4d8e-aefc-fd740239869f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:10 compute-0 nova_compute[181978]: 2026-01-12 13:48:10.865 181991 DEBUG nova.compute.manager [req-c55b3191-4f31-4eb2-b7c5-d324dfec876f req-3df17498-5ae4-4d8e-aefc-fd740239869f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] No waiting events found dispatching network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:10 compute-0 nova_compute[181978]: 2026-01-12 13:48:10.865 181991 WARNING nova.compute.manager [req-c55b3191-4f31-4eb2-b7c5-d324dfec876f req-3df17498-5ae4-4d8e-aefc-fd740239869f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received unexpected event network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 for instance with vm_state active and task_state None.
Jan 12 13:48:11 compute-0 nova_compute[181978]: 2026-01-12 13:48:11.233 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:11.234 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:48:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:11.235 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:48:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:11.236 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:11 compute-0 ovn_controller[94974]: 2026-01-12T13:48:11Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:32:72 10.100.0.20
Jan 12 13:48:11 compute-0 ovn_controller[94974]: 2026-01-12T13:48:11Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:32:72 10.100.0.20
Jan 12 13:48:12 compute-0 nova_compute[181978]: 2026-01-12 13:48:12.205 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:13 compute-0 nova_compute[181978]: 2026-01-12 13:48:13.520 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.446 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.447 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.461 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.527 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.528 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.536 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.536 181991 INFO nova.compute.claims [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:48:15 compute-0 podman[212094]: 2026-01-12 13:48:15.558495103 +0000 UTC m=+0.054095049 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:48:15 compute-0 podman[212095]: 2026-01-12 13:48:15.560630719 +0000 UTC m=+0.052957781 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 12 13:48:15 compute-0 podman[212093]: 2026-01-12 13:48:15.604416387 +0000 UTC m=+0.100235086 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.625 181991 DEBUG nova.compute.provider_tree [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.638 181991 DEBUG nova.scheduler.client.report [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.651 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.651 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.683 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.683 181991 DEBUG nova.network.neutron [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.693 181991 INFO nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.706 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.782 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.783 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.783 181991 INFO nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Creating image(s)
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.784 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.784 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.784 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.794 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.814 181991 DEBUG nova.policy [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.840 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.840 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.840 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.849 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.893 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.894 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.915 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.916 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.916 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.960 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.960 181991 DEBUG nova.virt.disk.api [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:48:15 compute-0 nova_compute[181978]: 2026-01-12 13:48:15.961 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.004 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.004 181991 DEBUG nova.virt.disk.api [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.005 181991 DEBUG nova.objects.instance [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 85b0aac7-4573-4a0a-953f-2061684396fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.083 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.084 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Ensure instance console log exists: /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.084 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.084 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.084 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:16 compute-0 nova_compute[181978]: 2026-01-12 13:48:16.487 181991 DEBUG nova.network.neutron [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Successfully created port: 688fc4fa-9056-4474-8ed0-19418c2c9363 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.207 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.217 181991 DEBUG nova.network.neutron [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Successfully updated port: 688fc4fa-9056-4474-8ed0-19418c2c9363 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.232 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-85b0aac7-4573-4a0a-953f-2061684396fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.233 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-85b0aac7-4573-4a0a-953f-2061684396fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.233 181991 DEBUG nova.network.neutron [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.287 181991 DEBUG nova.compute.manager [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received event network-changed-688fc4fa-9056-4474-8ed0-19418c2c9363 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.287 181991 DEBUG nova.compute.manager [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Refreshing instance network info cache due to event network-changed-688fc4fa-9056-4474-8ed0-19418c2c9363. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.287 181991 DEBUG oslo_concurrency.lockutils [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-85b0aac7-4573-4a0a-953f-2061684396fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.357 181991 DEBUG nova.network.neutron [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.495 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.615 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.615 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.615 181991 DEBUG nova.network.neutron [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.615 181991 DEBUG nova.objects.instance [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.921 181991 DEBUG nova.network.neutron [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Updating instance_info_cache with network_info: [{"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.937 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-85b0aac7-4573-4a0a-953f-2061684396fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.937 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Instance network_info: |[{"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.937 181991 DEBUG oslo_concurrency.lockutils [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-85b0aac7-4573-4a0a-953f-2061684396fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.938 181991 DEBUG nova.network.neutron [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Refreshing network info cache for port 688fc4fa-9056-4474-8ed0-19418c2c9363 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.939 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Start _get_guest_xml network_info=[{"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.942 181991 WARNING nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.948 181991 DEBUG nova.virt.libvirt.host [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.948 181991 DEBUG nova.virt.libvirt.host [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.950 181991 DEBUG nova.virt.libvirt.host [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.951 181991 DEBUG nova.virt.libvirt.host [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.951 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.951 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.951 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.951 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.952 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.952 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.952 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.952 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.952 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.952 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.952 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.953 181991 DEBUG nova.virt.hardware [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.955 181991 DEBUG nova.virt.libvirt.vif [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:48:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-467271010',display_name='tempest-TestNetworkBasicOps-server-467271010',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-467271010',id=7,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdhme5yxdVVi/6y6uA1cLb5H+87t5eLoVI6OFCmgN5fLgsn41nkQBGL9ZAbGUaL/JUteh6y9urraKbS60A3/njoq8UDdLMTVUNtXBmdqJIoqh456dpDsKDLMuLu5ix74A==',key_name='tempest-TestNetworkBasicOps-428723152',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-258mo8fm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:48:15Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=85b0aac7-4573-4a0a-953f-2061684396fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.955 181991 DEBUG nova.network.os_vif_util [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.955 181991 DEBUG nova.network.os_vif_util [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:02:18,bridge_name='br-int',has_traffic_filtering=True,id=688fc4fa-9056-4474-8ed0-19418c2c9363,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap688fc4fa-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.956 181991 DEBUG nova.objects.instance [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 85b0aac7-4573-4a0a-953f-2061684396fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.964 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <uuid>85b0aac7-4573-4a0a-953f-2061684396fa</uuid>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <name>instance-00000007</name>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-467271010</nova:name>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:48:17</nova:creationTime>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         <nova:port uuid="688fc4fa-9056-4474-8ed0-19418c2c9363">
Jan 12 13:48:17 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.23" ipVersion="4"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <system>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <entry name="serial">85b0aac7-4573-4a0a-953f-2061684396fa</entry>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <entry name="uuid">85b0aac7-4573-4a0a-953f-2061684396fa</entry>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </system>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <os>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   </os>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <features>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   </features>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk.config"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:48:02:18"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <target dev="tap688fc4fa-90"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/console.log" append="off"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <video>
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </video>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:48:17 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:48:17 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:48:17 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:48:17 compute-0 nova_compute[181978]: </domain>
Jan 12 13:48:17 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.964 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Preparing to wait for external event network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.965 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.965 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.965 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.965 181991 DEBUG nova.virt.libvirt.vif [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:48:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-467271010',display_name='tempest-TestNetworkBasicOps-server-467271010',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-467271010',id=7,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdhme5yxdVVi/6y6uA1cLb5H+87t5eLoVI6OFCmgN5fLgsn41nkQBGL9ZAbGUaL/JUteh6y9urraKbS60A3/njoq8UDdLMTVUNtXBmdqJIoqh456dpDsKDLMuLu5ix74A==',key_name='tempest-TestNetworkBasicOps-428723152',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-258mo8fm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:48:15Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=85b0aac7-4573-4a0a-953f-2061684396fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.966 181991 DEBUG nova.network.os_vif_util [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.966 181991 DEBUG nova.network.os_vif_util [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:02:18,bridge_name='br-int',has_traffic_filtering=True,id=688fc4fa-9056-4474-8ed0-19418c2c9363,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap688fc4fa-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.966 181991 DEBUG os_vif [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:02:18,bridge_name='br-int',has_traffic_filtering=True,id=688fc4fa-9056-4474-8ed0-19418c2c9363,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap688fc4fa-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.966 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.967 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.967 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.969 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.969 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap688fc4fa-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.969 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap688fc4fa-90, col_values=(('external_ids', {'iface-id': '688fc4fa-9056-4474-8ed0-19418c2c9363', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:02:18', 'vm-uuid': '85b0aac7-4573-4a0a-953f-2061684396fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:17 compute-0 NetworkManager[55211]: <info>  [1768225697.9707] manager: (tap688fc4fa-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.970 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.972 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.974 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:17 compute-0 nova_compute[181978]: 2026-01-12 13:48:17.975 181991 INFO os_vif [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:02:18,bridge_name='br-int',has_traffic_filtering=True,id=688fc4fa-9056-4474-8ed0-19418c2c9363,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap688fc4fa-90')
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.009 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.010 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.010 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:48:02:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.010 181991 INFO nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Using config drive
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.214 181991 INFO nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Creating config drive at /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk.config
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.218 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj7awl86r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.337 181991 DEBUG oslo_concurrency.processutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj7awl86r" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:18 compute-0 kernel: tap688fc4fa-90: entered promiscuous mode
Jan 12 13:48:18 compute-0 NetworkManager[55211]: <info>  [1768225698.3704] manager: (tap688fc4fa-90): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 12 13:48:18 compute-0 ovn_controller[94974]: 2026-01-12T13:48:18Z|00106|binding|INFO|Claiming lport 688fc4fa-9056-4474-8ed0-19418c2c9363 for this chassis.
Jan 12 13:48:18 compute-0 ovn_controller[94974]: 2026-01-12T13:48:18Z|00107|binding|INFO|688fc4fa-9056-4474-8ed0-19418c2c9363: Claiming fa:16:3e:48:02:18 10.100.0.23
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.372 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.386 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:02:18 10.100.0.23'], port_security=['fa:16:3e:48:02:18 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '85b0aac7-4573-4a0a-953f-2061684396fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '96b7d845-912a-4fc2-aeac-6a1f199bdeca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a476fd1-5f2d-4feb-930a-1d269face980, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=688fc4fa-9056-4474-8ed0-19418c2c9363) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.387 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 688fc4fa-9056-4474-8ed0-19418c2c9363 in datapath f937e860-a51b-4e2b-b213-ca4bc16774e1 bound to our chassis
Jan 12 13:48:18 compute-0 ovn_controller[94974]: 2026-01-12T13:48:18Z|00108|binding|INFO|Setting lport 688fc4fa-9056-4474-8ed0-19418c2c9363 ovn-installed in OVS
Jan 12 13:48:18 compute-0 ovn_controller[94974]: 2026-01-12T13:48:18Z|00109|binding|INFO|Setting lport 688fc4fa-9056-4474-8ed0-19418c2c9363 up in Southbound
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.388 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f937e860-a51b-4e2b-b213-ca4bc16774e1
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.388 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.393 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.399 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bf890f-a176-42cc-b523-50a34c5b9c7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:18 compute-0 systemd-machined[153581]: New machine qemu-7-instance-00000007.
Jan 12 13:48:18 compute-0 systemd-udevd[212194]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:48:18 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Jan 12 13:48:18 compute-0 NetworkManager[55211]: <info>  [1768225698.4209] device (tap688fc4fa-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:48:18 compute-0 NetworkManager[55211]: <info>  [1768225698.4216] device (tap688fc4fa-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.419 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd6f44c-2b3d-4432-b03d-3439a1b8e5c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.424 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[618c79fa-834e-483a-b59f-f96061f5d60e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.440 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d57e7f-c65b-40d6-938a-aa12f86ea2d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.450 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5ccbf2-08a3-403d-8dd1-041b8da32f6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf937e860-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:2b:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 275469, 'reachable_time': 43110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212202, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.459 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[430552a2-71bb-4ad6-8cc7-2328b07edd6c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf937e860-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 275476, 'tstamp': 275476}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212205, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapf937e860-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 275478, 'tstamp': 275478}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212205, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.461 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf937e860-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.462 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.465 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf937e860-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.465 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.465 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf937e860-a0, col_values=(('external_ids', {'iface-id': 'a49f1399-8753-45e9-a64a-e65639cdeac7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:18.465 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.555 181991 DEBUG nova.compute.manager [req-8286de28-2322-487c-bd5a-8ab6761c276b req-ebcea2d4-679b-410c-88a4-fd0391bb0abc 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received event network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.555 181991 DEBUG oslo_concurrency.lockutils [req-8286de28-2322-487c-bd5a-8ab6761c276b req-ebcea2d4-679b-410c-88a4-fd0391bb0abc 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.556 181991 DEBUG oslo_concurrency.lockutils [req-8286de28-2322-487c-bd5a-8ab6761c276b req-ebcea2d4-679b-410c-88a4-fd0391bb0abc 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.556 181991 DEBUG oslo_concurrency.lockutils [req-8286de28-2322-487c-bd5a-8ab6761c276b req-ebcea2d4-679b-410c-88a4-fd0391bb0abc 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.556 181991 DEBUG nova.compute.manager [req-8286de28-2322-487c-bd5a-8ab6761c276b req-ebcea2d4-679b-410c-88a4-fd0391bb0abc 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Processing event network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.820 181991 DEBUG nova.network.neutron [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Updated VIF entry in instance network info cache for port 688fc4fa-9056-4474-8ed0-19418c2c9363. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.821 181991 DEBUG nova.network.neutron [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Updating instance_info_cache with network_info: [{"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.831 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.832 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225698.8314335, 85b0aac7-4573-4a0a-953f-2061684396fa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.832 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] VM Started (Lifecycle Event)
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.835 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.837 181991 DEBUG oslo_concurrency.lockutils [req-f7a08c6e-0db6-4c86-b534-26a0220ac6ff req-558e83f9-f6aa-4fcb-aca7-ead1719d250e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-85b0aac7-4573-4a0a-953f-2061684396fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.838 181991 INFO nova.virt.libvirt.driver [-] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Instance spawned successfully.
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.838 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.848 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.853 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.853 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.854 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.854 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.855 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.855 181991 DEBUG nova.virt.libvirt.driver [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.857 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.885 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.885 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225698.8319755, 85b0aac7-4573-4a0a-953f-2061684396fa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.885 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] VM Paused (Lifecycle Event)
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.903 181991 INFO nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Took 3.12 seconds to spawn the instance on the hypervisor.
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.903 181991 DEBUG nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.904 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.908 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225698.8342612, 85b0aac7-4573-4a0a-953f-2061684396fa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.909 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] VM Resumed (Lifecycle Event)
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.927 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.928 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.945 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.953 181991 INFO nova.compute.manager [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Took 3.45 seconds to build instance.
Jan 12 13:48:18 compute-0 nova_compute[181978]: 2026-01-12 13:48:18.963 181991 DEBUG oslo_concurrency.lockutils [None req-f5a98f7f-1c66-45bf-adc3-3f3eea1b26ea d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:19 compute-0 nova_compute[181978]: 2026-01-12 13:48:19.223 181991 DEBUG nova.network.neutron [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:19 compute-0 nova_compute[181978]: 2026-01-12 13:48:19.241 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:19 compute-0 nova_compute[181978]: 2026-01-12 13:48:19.241 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 12 13:48:19 compute-0 nova_compute[181978]: 2026-01-12 13:48:19.242 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:19 compute-0 nova_compute[181978]: 2026-01-12 13:48:19.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.505 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.505 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.505 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.506 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.555 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.613 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.614 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.650 181991 DEBUG nova.compute.manager [req-9b2a5079-9605-4eeb-9021-3f6c408400da req-48868697-757b-41c9-8010-a41095ca50d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received event network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.651 181991 DEBUG oslo_concurrency.lockutils [req-9b2a5079-9605-4eeb-9021-3f6c408400da req-48868697-757b-41c9-8010-a41095ca50d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.652 181991 DEBUG oslo_concurrency.lockutils [req-9b2a5079-9605-4eeb-9021-3f6c408400da req-48868697-757b-41c9-8010-a41095ca50d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.652 181991 DEBUG oslo_concurrency.lockutils [req-9b2a5079-9605-4eeb-9021-3f6c408400da req-48868697-757b-41c9-8010-a41095ca50d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.652 181991 DEBUG nova.compute.manager [req-9b2a5079-9605-4eeb-9021-3f6c408400da req-48868697-757b-41c9-8010-a41095ca50d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] No waiting events found dispatching network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.653 181991 WARNING nova.compute.manager [req-9b2a5079-9605-4eeb-9021-3f6c408400da req-48868697-757b-41c9-8010-a41095ca50d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received unexpected event network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 for instance with vm_state active and task_state None.
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.672 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.678 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.734 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.736 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:48:20 compute-0 nova_compute[181978]: 2026-01-12 13:48:20.793 181991 DEBUG oslo_concurrency.processutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.036 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.037 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5489MB free_disk=73.3507194519043GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.037 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.037 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.091 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Instance e21c2b66-4a73-4093-b44b-c47371cf431e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.091 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Instance 85b0aac7-4573-4a0a-953f-2061684396fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.091 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.092 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.136 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.148 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.161 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:48:21 compute-0 nova_compute[181978]: 2026-01-12 13:48:21.161 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:22 compute-0 nova_compute[181978]: 2026-01-12 13:48:22.157 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:22 compute-0 nova_compute[181978]: 2026-01-12 13:48:22.158 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:22 compute-0 nova_compute[181978]: 2026-01-12 13:48:22.174 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:22 compute-0 nova_compute[181978]: 2026-01-12 13:48:22.209 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:22 compute-0 podman[212227]: 2026-01-12 13:48:22.545488802 +0000 UTC m=+0.040882427 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 12 13:48:22 compute-0 nova_compute[181978]: 2026-01-12 13:48:22.971 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:23 compute-0 nova_compute[181978]: 2026-01-12 13:48:23.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:48:23 compute-0 nova_compute[181978]: 2026-01-12 13:48:23.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:48:27 compute-0 nova_compute[181978]: 2026-01-12 13:48:27.211 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:27 compute-0 nova_compute[181978]: 2026-01-12 13:48:27.972 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:29 compute-0 ovn_controller[94974]: 2026-01-12T13:48:29Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:48:02:18 10.100.0.23
Jan 12 13:48:29 compute-0 ovn_controller[94974]: 2026-01-12T13:48:29Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:02:18 10.100.0.23
Jan 12 13:48:32 compute-0 nova_compute[181978]: 2026-01-12 13:48:32.212 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:32 compute-0 nova_compute[181978]: 2026-01-12 13:48:32.973 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:33 compute-0 podman[212253]: 2026-01-12 13:48:33.54358777 +0000 UTC m=+0.037752230 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 12 13:48:36 compute-0 podman[212274]: 2026-01-12 13:48:36.547379315 +0000 UTC m=+0.042752917 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 12 13:48:37 compute-0 nova_compute[181978]: 2026-01-12 13:48:37.214 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:37 compute-0 nova_compute[181978]: 2026-01-12 13:48:37.508 181991 DEBUG nova.compute.manager [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-changed-eafb6dbf-17d8-48eb-b6d5-3942724ec106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:37 compute-0 nova_compute[181978]: 2026-01-12 13:48:37.508 181991 DEBUG nova.compute.manager [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing instance network info cache due to event network-changed-eafb6dbf-17d8-48eb-b6d5-3942724ec106. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:48:37 compute-0 nova_compute[181978]: 2026-01-12 13:48:37.509 181991 DEBUG oslo_concurrency.lockutils [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:37 compute-0 nova_compute[181978]: 2026-01-12 13:48:37.509 181991 DEBUG oslo_concurrency.lockutils [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:37 compute-0 nova_compute[181978]: 2026-01-12 13:48:37.509 181991 DEBUG nova.network.neutron [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing network info cache for port eafb6dbf-17d8-48eb-b6d5-3942724ec106 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:48:37 compute-0 nova_compute[181978]: 2026-01-12 13:48:37.974 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:38 compute-0 nova_compute[181978]: 2026-01-12 13:48:38.596 181991 DEBUG nova.network.neutron [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updated VIF entry in instance network info cache for port eafb6dbf-17d8-48eb-b6d5-3942724ec106. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:48:38 compute-0 nova_compute[181978]: 2026-01-12 13:48:38.597 181991 DEBUG nova.network.neutron [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:38 compute-0 nova_compute[181978]: 2026-01-12 13:48:38.616 181991 DEBUG oslo_concurrency.lockutils [req-886afde0-4de6-4f34-b283-f0667e668c42 req-565d1ebf-ccda-4fd4-8345-289acce792ab 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:40.201 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:40.202 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:40.202 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:42 compute-0 nova_compute[181978]: 2026-01-12 13:48:42.215 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:42 compute-0 nova_compute[181978]: 2026-01-12 13:48:42.977 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.954 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.955 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.955 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.955 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.955 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.956 181991 INFO nova.compute.manager [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Terminating instance
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.957 181991 DEBUG nova.compute.manager [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:48:43 compute-0 kernel: tap688fc4fa-90 (unregistering): left promiscuous mode
Jan 12 13:48:43 compute-0 NetworkManager[55211]: <info>  [1768225723.9833] device (tap688fc4fa-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:48:43 compute-0 ovn_controller[94974]: 2026-01-12T13:48:43Z|00110|binding|INFO|Releasing lport 688fc4fa-9056-4474-8ed0-19418c2c9363 from this chassis (sb_readonly=0)
Jan 12 13:48:43 compute-0 ovn_controller[94974]: 2026-01-12T13:48:43Z|00111|binding|INFO|Setting lport 688fc4fa-9056-4474-8ed0-19418c2c9363 down in Southbound
Jan 12 13:48:43 compute-0 ovn_controller[94974]: 2026-01-12T13:48:43Z|00112|binding|INFO|Removing iface tap688fc4fa-90 ovn-installed in OVS
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.990 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:43 compute-0 nova_compute[181978]: 2026-01-12 13:48:43.993 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:43.998 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:02:18 10.100.0.23'], port_security=['fa:16:3e:48:02:18 10.100.0.23'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.23/28', 'neutron:device_id': '85b0aac7-4573-4a0a-953f-2061684396fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '96b7d845-912a-4fc2-aeac-6a1f199bdeca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a476fd1-5f2d-4feb-930a-1d269face980, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=688fc4fa-9056-4474-8ed0-19418c2c9363) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:43.999 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 688fc4fa-9056-4474-8ed0-19418c2c9363 in datapath f937e860-a51b-4e2b-b213-ca4bc16774e1 unbound from our chassis
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:43.999 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f937e860-a51b-4e2b-b213-ca4bc16774e1
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.006 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.013 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1a90cd42-c073-407c-9fa6-0f69d7e4b044]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.033 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc1f805-58b8-452d-ba9d-530027c14a9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.035 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[e6de39fe-4f02-430d-99fd-fcdf6c52240e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:44 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 12 13:48:44 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 10.594s CPU time.
Jan 12 13:48:44 compute-0 systemd-machined[153581]: Machine qemu-7-instance-00000007 terminated.
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.056 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[113650a1-0706-4de7-8c47-f0ad398b6dd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.066 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[67cdc9a8-52ef-4b17-8528-c33afe3f5a54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf937e860-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:2b:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 880, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 880, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 275469, 'reachable_time': 43110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 600, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 600, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212302, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.075 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[341d75f3-dce0-4047-9d81-abbd5ad5abda]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf937e860-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 275476, 'tstamp': 275476}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212303, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapf937e860-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 275478, 'tstamp': 275478}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212303, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.076 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf937e860-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.077 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.080 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.080 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf937e860-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.081 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.081 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf937e860-a0, col_values=(('external_ids', {'iface-id': 'a49f1399-8753-45e9-a64a-e65639cdeac7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:44 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:44.081 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.169 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.172 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.195 181991 INFO nova.virt.libvirt.driver [-] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Instance destroyed successfully.
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.195 181991 DEBUG nova.objects.instance [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 85b0aac7-4573-4a0a-953f-2061684396fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.205 181991 DEBUG nova.virt.libvirt.vif [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:48:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-467271010',display_name='tempest-TestNetworkBasicOps-server-467271010',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-467271010',id=7,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdhme5yxdVVi/6y6uA1cLb5H+87t5eLoVI6OFCmgN5fLgsn41nkQBGL9ZAbGUaL/JUteh6y9urraKbS60A3/njoq8UDdLMTVUNtXBmdqJIoqh456dpDsKDLMuLu5ix74A==',key_name='tempest-TestNetworkBasicOps-428723152',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:48:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-258mo8fm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:48:18Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=85b0aac7-4573-4a0a-953f-2061684396fa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.206 181991 DEBUG nova.network.os_vif_util [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "688fc4fa-9056-4474-8ed0-19418c2c9363", "address": "fa:16:3e:48:02:18", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.23", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap688fc4fa-90", "ovs_interfaceid": "688fc4fa-9056-4474-8ed0-19418c2c9363", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.206 181991 DEBUG nova.network.os_vif_util [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:02:18,bridge_name='br-int',has_traffic_filtering=True,id=688fc4fa-9056-4474-8ed0-19418c2c9363,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap688fc4fa-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.206 181991 DEBUG os_vif [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:02:18,bridge_name='br-int',has_traffic_filtering=True,id=688fc4fa-9056-4474-8ed0-19418c2c9363,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap688fc4fa-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.208 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.208 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap688fc4fa-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.209 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.210 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.212 181991 INFO os_vif [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:02:18,bridge_name='br-int',has_traffic_filtering=True,id=688fc4fa-9056-4474-8ed0-19418c2c9363,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap688fc4fa-90')
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.212 181991 INFO nova.virt.libvirt.driver [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Deleting instance files /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa_del
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.213 181991 INFO nova.virt.libvirt.driver [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Deletion of /var/lib/nova/instances/85b0aac7-4573-4a0a-953f-2061684396fa_del complete
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.243 181991 INFO nova.compute.manager [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Took 0.29 seconds to destroy the instance on the hypervisor.
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.243 181991 DEBUG oslo.service.loopingcall [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.243 181991 DEBUG nova.compute.manager [-] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.243 181991 DEBUG nova.network.neutron [-] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.454 181991 DEBUG nova.compute.manager [req-7d5b3b7f-978f-46cc-a34e-26ed2f365312 req-d5ce27e2-e70d-4787-978a-c4350ae688ec 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received event network-vif-unplugged-688fc4fa-9056-4474-8ed0-19418c2c9363 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.455 181991 DEBUG oslo_concurrency.lockutils [req-7d5b3b7f-978f-46cc-a34e-26ed2f365312 req-d5ce27e2-e70d-4787-978a-c4350ae688ec 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.455 181991 DEBUG oslo_concurrency.lockutils [req-7d5b3b7f-978f-46cc-a34e-26ed2f365312 req-d5ce27e2-e70d-4787-978a-c4350ae688ec 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.455 181991 DEBUG oslo_concurrency.lockutils [req-7d5b3b7f-978f-46cc-a34e-26ed2f365312 req-d5ce27e2-e70d-4787-978a-c4350ae688ec 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.455 181991 DEBUG nova.compute.manager [req-7d5b3b7f-978f-46cc-a34e-26ed2f365312 req-d5ce27e2-e70d-4787-978a-c4350ae688ec 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] No waiting events found dispatching network-vif-unplugged-688fc4fa-9056-4474-8ed0-19418c2c9363 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.455 181991 DEBUG nova.compute.manager [req-7d5b3b7f-978f-46cc-a34e-26ed2f365312 req-d5ce27e2-e70d-4787-978a-c4350ae688ec 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received event network-vif-unplugged-688fc4fa-9056-4474-8ed0-19418c2c9363 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.728 181991 DEBUG nova.network.neutron [-] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.741 181991 INFO nova.compute.manager [-] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Took 0.50 seconds to deallocate network for instance.
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.767 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.768 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.784 181991 DEBUG nova.compute.manager [req-aac7d0f3-12a8-452c-9145-bf1584f09a97 req-2deb95b7-83d3-48eb-b0f5-43c114f424d2 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received event network-vif-deleted-688fc4fa-9056-4474-8ed0-19418c2c9363 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.832 181991 DEBUG nova.compute.provider_tree [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.840 181991 DEBUG nova.scheduler.client.report [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.852 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.873 181991 INFO nova.scheduler.client.report [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 85b0aac7-4573-4a0a-953f-2061684396fa
Jan 12 13:48:44 compute-0 nova_compute[181978]: 2026-01-12 13:48:44.917 181991 DEBUG oslo_concurrency.lockutils [None req-45e70cd4-8633-4264-81dd-a11ba19edd4e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.519 181991 DEBUG oslo_concurrency.lockutils [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "interface-e21c2b66-4a73-4093-b44b-c47371cf431e-eafb6dbf-17d8-48eb-b6d5-3942724ec106" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.520 181991 DEBUG oslo_concurrency.lockutils [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-e21c2b66-4a73-4093-b44b-c47371cf431e-eafb6dbf-17d8-48eb-b6d5-3942724ec106" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.529 181991 DEBUG nova.objects.instance [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'flavor' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.532 181991 DEBUG nova.compute.manager [req-7951dd91-4e7a-4771-82f4-1671e3b7fd48 req-d88d7545-60cd-43c3-9992-d9342ea9ba82 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received event network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.532 181991 DEBUG oslo_concurrency.lockutils [req-7951dd91-4e7a-4771-82f4-1671e3b7fd48 req-d88d7545-60cd-43c3-9992-d9342ea9ba82 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.533 181991 DEBUG oslo_concurrency.lockutils [req-7951dd91-4e7a-4771-82f4-1671e3b7fd48 req-d88d7545-60cd-43c3-9992-d9342ea9ba82 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.533 181991 DEBUG oslo_concurrency.lockutils [req-7951dd91-4e7a-4771-82f4-1671e3b7fd48 req-d88d7545-60cd-43c3-9992-d9342ea9ba82 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "85b0aac7-4573-4a0a-953f-2061684396fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.533 181991 DEBUG nova.compute.manager [req-7951dd91-4e7a-4771-82f4-1671e3b7fd48 req-d88d7545-60cd-43c3-9992-d9342ea9ba82 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] No waiting events found dispatching network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.533 181991 WARNING nova.compute.manager [req-7951dd91-4e7a-4771-82f4-1671e3b7fd48 req-d88d7545-60cd-43c3-9992-d9342ea9ba82 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Received unexpected event network-vif-plugged-688fc4fa-9056-4474-8ed0-19418c2c9363 for instance with vm_state deleted and task_state None.
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.548 181991 DEBUG nova.virt.libvirt.vif [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.548 181991 DEBUG nova.network.os_vif_util [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.550 181991 DEBUG nova.network.os_vif_util [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.556 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.557 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.559 181991 DEBUG nova.virt.libvirt.driver [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Attempting to detach device tapeafb6dbf-17 from instance e21c2b66-4a73-4093-b44b-c47371cf431e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.559 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] detach device xml: <interface type="ethernet">
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <mac address="fa:16:3e:af:32:72"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <model type="virtio"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <mtu size="1442"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <target dev="tapeafb6dbf-17"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]: </interface>
Jan 12 13:48:46 compute-0 nova_compute[181978]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 12 13:48:46 compute-0 podman[212323]: 2026-01-12 13:48:46.562505831 +0000 UTC m=+0.051175230 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=)
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.562 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:48:46 compute-0 podman[212322]: 2026-01-12 13:48:46.564988007 +0000 UTC m=+0.056483239 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.569 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <name>instance-00000006</name>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <uuid>e21c2b66-4a73-4093-b44b-c47371cf431e</uuid>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:48:08</nova:creationTime>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:port uuid="eafb6dbf-17d8-48eb-b6d5-3942724ec106">
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:48:46 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <system>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='serial'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='uuid'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </system>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <os>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </os>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <features>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </features>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk' index='2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config' index='1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:86:61:fe'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target dev='tapf4b71a00-88'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:af:32:72'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target dev='tapeafb6dbf-17'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='net1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       </target>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </console>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <video>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </video>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c424,c1010</label>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c424,c1010</imagelabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]: </domain>
Jan 12 13:48:46 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.569 181991 INFO nova.virt.libvirt.driver [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully detached device tapeafb6dbf-17 from instance e21c2b66-4a73-4093-b44b-c47371cf431e from the persistent domain config.
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.569 181991 DEBUG nova.virt.libvirt.driver [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] (1/8): Attempting to detach device tapeafb6dbf-17 with device alias net1 from instance e21c2b66-4a73-4093-b44b-c47371cf431e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.569 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] detach device xml: <interface type="ethernet">
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <mac address="fa:16:3e:af:32:72"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <model type="virtio"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <mtu size="1442"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <target dev="tapeafb6dbf-17"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]: </interface>
Jan 12 13:48:46 compute-0 nova_compute[181978]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 12 13:48:46 compute-0 podman[212321]: 2026-01-12 13:48:46.583741734 +0000 UTC m=+0.076715576 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 12 13:48:46 compute-0 kernel: tapeafb6dbf-17 (unregistering): left promiscuous mode
Jan 12 13:48:46 compute-0 NetworkManager[55211]: <info>  [1768225726.6576] device (tapeafb6dbf-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:48:46 compute-0 ovn_controller[94974]: 2026-01-12T13:48:46Z|00113|binding|INFO|Releasing lport eafb6dbf-17d8-48eb-b6d5-3942724ec106 from this chassis (sb_readonly=0)
Jan 12 13:48:46 compute-0 ovn_controller[94974]: 2026-01-12T13:48:46Z|00114|binding|INFO|Setting lport eafb6dbf-17d8-48eb-b6d5-3942724ec106 down in Southbound
Jan 12 13:48:46 compute-0 ovn_controller[94974]: 2026-01-12T13:48:46Z|00115|binding|INFO|Removing iface tapeafb6dbf-17 ovn-installed in OVS
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.664 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.665 181991 DEBUG nova.virt.libvirt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Received event <DeviceRemovedEvent: 1768225726.6656868, e21c2b66-4a73-4093-b44b-c47371cf431e => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.667 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:32:72 10.100.0.20', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'e21c2b66-4a73-4093-b44b-c47371cf431e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a476fd1-5f2d-4feb-930a-1d269face980, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=eafb6dbf-17d8-48eb-b6d5-3942724ec106) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.668 181991 DEBUG nova.virt.libvirt.driver [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Start waiting for the detach event from libvirt for device tapeafb6dbf-17 with device alias net1 for instance e21c2b66-4a73-4093-b44b-c47371cf431e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.668 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.668 104189 INFO neutron.agent.ovn.metadata.agent [-] Port eafb6dbf-17d8-48eb-b6d5-3942724ec106 in datapath f937e860-a51b-4e2b-b213-ca4bc16774e1 unbound from our chassis
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.669 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f937e860-a51b-4e2b-b213-ca4bc16774e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.670 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[12550680-b88c-4545-80c6-d859b69b3882]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.670 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1 namespace which is not needed anymore
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.671 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <name>instance-00000006</name>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <uuid>e21c2b66-4a73-4093-b44b-c47371cf431e</uuid>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:48:08</nova:creationTime>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:port uuid="eafb6dbf-17d8-48eb-b6d5-3942724ec106">
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:48:46 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <system>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='serial'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='uuid'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </system>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <os>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </os>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <features>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </features>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk' index='2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config' index='1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:86:61:fe'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target dev='tapf4b71a00-88'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       </target>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </console>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <video>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </video>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c424,c1010</label>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c424,c1010</imagelabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:46 compute-0 nova_compute[181978]: </domain>
Jan 12 13:48:46 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.671 181991 INFO nova.virt.libvirt.driver [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully detached device tapeafb6dbf-17 from instance e21c2b66-4a73-4093-b44b-c47371cf431e from the live domain config.
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.671 181991 DEBUG nova.virt.libvirt.vif [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.671 181991 DEBUG nova.network.os_vif_util [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.672 181991 DEBUG nova.network.os_vif_util [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.672 181991 DEBUG os_vif [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.673 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.673 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeafb6dbf-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.679 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.681 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.682 181991 INFO os_vif [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17')
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.683 181991 DEBUG nova.virt.libvirt.guest [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:48:46</nova:creationTime>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:48:46 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:48:46 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:46 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:48:46 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:48:46 compute-0 nova_compute[181978]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 12 13:48:46 compute-0 neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1[212082]: [NOTICE]   (212086) : haproxy version is 2.8.14-c23fe91
Jan 12 13:48:46 compute-0 neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1[212082]: [NOTICE]   (212086) : path to executable is /usr/sbin/haproxy
Jan 12 13:48:46 compute-0 neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1[212082]: [ALERT]    (212086) : Current worker (212088) exited with code 143 (Terminated)
Jan 12 13:48:46 compute-0 neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1[212082]: [WARNING]  (212086) : All workers exited. Exiting... (0)
Jan 12 13:48:46 compute-0 systemd[1]: libpod-df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0.scope: Deactivated successfully.
Jan 12 13:48:46 compute-0 podman[212401]: 2026-01-12 13:48:46.763532423 +0000 UTC m=+0.033849959 container died df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 12 13:48:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0-userdata-shm.mount: Deactivated successfully.
Jan 12 13:48:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-842558a7ffdefeb2bcf526ce8064799f74f31c41c05d455fb7578c36770b821f-merged.mount: Deactivated successfully.
Jan 12 13:48:46 compute-0 podman[212401]: 2026-01-12 13:48:46.798359808 +0000 UTC m=+0.068677326 container cleanup df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:48:46 compute-0 systemd[1]: libpod-conmon-df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0.scope: Deactivated successfully.
Jan 12 13:48:46 compute-0 podman[212425]: 2026-01-12 13:48:46.84020156 +0000 UTC m=+0.024352284 container remove df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.843 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d9954034-73f7-4a95-9bd2-3dae526530f4]: (4, ('Mon Jan 12 01:48:46 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1 (df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0)\ndf9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0\nMon Jan 12 01:48:46 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1 (df9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0)\ndf9c76e66820863708234707e52fc8576280f3ffbb04a881ecbc23c1ec1af9a0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.844 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ea980553-788c-418a-8005-1d66de14d316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.845 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf937e860-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:46 compute-0 kernel: tapf937e860-a0: left promiscuous mode
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.848 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:46 compute-0 nova_compute[181978]: 2026-01-12 13:48:46.860 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.861 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[60df1314-397c-41da-8a0a-727cc05438ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.876 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[f281fbc3-843c-49d7-9351-7338d2ae1312]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.876 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[f3abcf97-fd47-4221-9c99-4a042b894e70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.887 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9ca396-f0b3-4fc7-956d-61b55afbcd38]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 275464, 'reachable_time': 19148, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212437, 'error': None, 'target': 'ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:46 compute-0 systemd[1]: run-netns-ovnmeta\x2df937e860\x2da51b\x2d4e2b\x2db213\x2dca4bc16774e1.mount: Deactivated successfully.
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.890 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f937e860-a51b-4e2b-b213-ca4bc16774e1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:48:46 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:46.890 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[24298ebf-1eee-456d-af14-6b4a7080765b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.217 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.537 181991 DEBUG oslo_concurrency.lockutils [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.537 181991 DEBUG oslo_concurrency.lockutils [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.537 181991 DEBUG nova.network.neutron [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.564 181991 DEBUG nova.compute.manager [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-deleted-eafb6dbf-17d8-48eb-b6d5-3942724ec106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.565 181991 INFO nova.compute.manager [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Neutron deleted interface eafb6dbf-17d8-48eb-b6d5-3942724ec106; detaching it from the instance and deleting it from the info cache
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.565 181991 DEBUG nova.network.neutron [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.578 181991 DEBUG nova.objects.instance [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lazy-loading 'system_metadata' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.597 181991 DEBUG nova.objects.instance [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lazy-loading 'flavor' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.611 181991 DEBUG nova.virt.libvirt.vif [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.611 181991 DEBUG nova.network.os_vif_util [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converting VIF {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.612 181991 DEBUG nova.network.os_vif_util [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.614 181991 DEBUG nova.virt.libvirt.guest [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.616 181991 DEBUG nova.virt.libvirt.guest [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <name>instance-00000006</name>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <uuid>e21c2b66-4a73-4093-b44b-c47371cf431e</uuid>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:48:46</nova:creationTime>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:48:47 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <system>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='serial'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='uuid'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </system>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <os>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </os>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <features>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </features>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk' index='2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config' index='1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:86:61:fe'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target dev='tapf4b71a00-88'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       </target>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </console>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <video>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </video>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c424,c1010</label>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c424,c1010</imagelabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]: </domain>
Jan 12 13:48:47 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.616 181991 DEBUG nova.virt.libvirt.guest [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.619 181991 DEBUG nova.virt.libvirt.guest [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:af:32:72"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeafb6dbf-17"/></interface>not found in domain: <domain type='kvm' id='6'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <name>instance-00000006</name>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <uuid>e21c2b66-4a73-4093-b44b-c47371cf431e</uuid>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:48:46</nova:creationTime>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:48:47 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <memory unit='KiB'>131072</memory>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <vcpu placement='static'>1</vcpu>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <resource>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <partition>/machine</partition>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </resource>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <sysinfo type='smbios'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <system>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='manufacturer'>RDO</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='product'>OpenStack Compute</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='serial'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='uuid'>e21c2b66-4a73-4093-b44b-c47371cf431e</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <entry name='family'>Virtual Machine</entry>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </system>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <os>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <boot dev='hd'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <smbios mode='sysinfo'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </os>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <features>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <vmcoreinfo state='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </features>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <cpu mode='custom' match='exact' check='full'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <model fallback='forbid'>EPYC-Milan</model>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <vendor>AMD</vendor>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='x2apic'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc-deadline'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='hypervisor'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='tsc_adjust'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='vaes'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='vpclmulqdq'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='spec-ctrl'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='stibp'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='ssbd'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='cmp_legacy'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='overflow-recov'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='succor'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='virt-ssbd'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='lbrv'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='tsc-scale'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='vmcb-clean'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='flushbyasid'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='pause-filter'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='pfthreshold'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='v-vmsave-vmload'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='vgif'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='svm'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='require' name='topoext'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='npt'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='nrip-save'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <clock offset='utc'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <timer name='pit' tickpolicy='delay'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <timer name='hpet' present='no'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <on_poweroff>destroy</on_poweroff>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <on_reboot>restart</on_reboot>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <on_crash>destroy</on_crash>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <disk type='file' device='disk'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk' index='2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <backingStore type='file' index='3'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <format type='raw'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <source file='/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <backingStore/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       </backingStore>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target dev='vda' bus='virtio'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='virtio-disk0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <disk type='file' device='cdrom'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <driver name='qemu' type='raw' cache='none'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/disk.config' index='1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <backingStore/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target dev='sda' bus='sata'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <readonly/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='sata0-0-0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='0' model='pcie-root'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pcie.0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='1' port='0x10'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='2' port='0x11'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='3' port='0x12'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='4' port='0x13'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='5' port='0x14'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='6' port='0x15'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='7' port='0x16'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='8' port='0x17'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.8'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='9' port='0x18'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.9'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='10' port='0x19'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.10'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='11' port='0x1a'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.11'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='12' port='0x1b'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.12'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='13' port='0x1c'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.13'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='14' port='0x1d'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.14'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='15' port='0x1e'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.15'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='16' port='0x1f'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.16'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='17' port='0x20'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.17'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='18' port='0x21'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.18'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='19' port='0x22'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.19'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='20' port='0x23'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.20'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='21' port='0x24'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.21'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='22' port='0x25'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.22'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='23' port='0x26'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.23'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='24' port='0x27'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.24'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-root-port'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target chassis='25' port='0x28'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.25'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model name='pcie-pci-bridge'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='pci.26'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='usb'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <controller type='sata' index='0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='ide'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </controller>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <interface type='ethernet'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <mac address='fa:16:3e:86:61:fe'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target dev='tapf4b71a00-88'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model type='virtio'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <driver name='vhost' rx_queue_size='512'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <mtu size='1442'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='net0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <serial type='pty'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target type='isa-serial' port='0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:         <model name='isa-serial'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       </target>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <console type='pty' tty='/dev/pts/0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <source path='/dev/pts/0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <log file='/var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e/console.log' append='off'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <target type='serial' port='0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='serial0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </console>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <input type='tablet' bus='usb'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='input0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='usb' bus='0' port='1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <input type='mouse' bus='ps2'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='input1'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <input type='keyboard' bus='ps2'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='input2'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </input>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <listen type='address' address='::0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </graphics>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <audio id='1' type='none'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <video>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <model type='virtio' heads='1' primary='yes'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='video0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </video>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <watchdog model='itco' action='reset'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='watchdog0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </watchdog>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <memballoon model='virtio'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <stats period='10'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='balloon0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <rng model='virtio'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <backend model='random'>/dev/urandom</backend>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <alias name='rng0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <label>system_u:system_r:svirt_t:s0:c424,c1010</label>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c424,c1010</imagelabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <label>+107:+107</label>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <imagelabel>+107:+107</imagelabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </seclabel>
Jan 12 13:48:47 compute-0 nova_compute[181978]: </domain>
Jan 12 13:48:47 compute-0 nova_compute[181978]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.619 181991 WARNING nova.virt.libvirt.driver [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Detaching interface fa:16:3e:af:32:72 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapeafb6dbf-17' not found.
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.620 181991 DEBUG nova.virt.libvirt.vif [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.620 181991 DEBUG nova.network.os_vif_util [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converting VIF {"id": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "address": "fa:16:3e:af:32:72", "network": {"id": "f937e860-a51b-4e2b-b213-ca4bc16774e1", "bridge": "br-int", "label": "tempest-network-smoke--126016530", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeafb6dbf-17", "ovs_interfaceid": "eafb6dbf-17d8-48eb-b6d5-3942724ec106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.620 181991 DEBUG nova.network.os_vif_util [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.621 181991 DEBUG os_vif [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.621 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.622 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeafb6dbf-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.622 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.623 181991 INFO os_vif [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:32:72,bridge_name='br-int',has_traffic_filtering=True,id=eafb6dbf-17d8-48eb-b6d5-3942724ec106,network=Network(f937e860-a51b-4e2b-b213-ca4bc16774e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeafb6dbf-17')
Jan 12 13:48:47 compute-0 nova_compute[181978]: 2026-01-12 13:48:47.623 181991 DEBUG nova.virt.libvirt.guest [req-4b0ddd1e-b920-4c41-b8df-bd8da3c7033e req-f975d81a-6eeb-48ef-9e19-b0e593df0625 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:name>tempest-TestNetworkBasicOps-server-1564557670</nova:name>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:creationTime>2026-01-12 13:48:47</nova:creationTime>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:flavor name="m1.nano">
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:memory>128</nova:memory>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:disk>1</nova:disk>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:swap>0</nova:swap>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:vcpus>1</nova:vcpus>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:flavor>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:owner>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:owner>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   <nova:ports>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     <nova:port uuid="f4b71a00-88ba-4e02-82f2-54866d84d7bd">
Jan 12 13:48:47 compute-0 nova_compute[181978]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 12 13:48:47 compute-0 nova_compute[181978]:     </nova:port>
Jan 12 13:48:47 compute-0 nova_compute[181978]:   </nova:ports>
Jan 12 13:48:47 compute-0 nova_compute[181978]: </nova:instance>
Jan 12 13:48:47 compute-0 nova_compute[181978]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.661 181991 DEBUG nova.compute.manager [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-unplugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.661 181991 DEBUG oslo_concurrency.lockutils [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.662 181991 DEBUG oslo_concurrency.lockutils [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.662 181991 DEBUG oslo_concurrency.lockutils [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.662 181991 DEBUG nova.compute.manager [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] No waiting events found dispatching network-vif-unplugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.662 181991 WARNING nova.compute.manager [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received unexpected event network-vif-unplugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 for instance with vm_state active and task_state None.
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.662 181991 DEBUG nova.compute.manager [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.662 181991 DEBUG oslo_concurrency.lockutils [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.663 181991 DEBUG oslo_concurrency.lockutils [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.663 181991 DEBUG oslo_concurrency.lockutils [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.663 181991 DEBUG nova.compute.manager [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] No waiting events found dispatching network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.663 181991 WARNING nova.compute.manager [req-24614f5a-0554-4db9-9a38-8f9e0364e221 req-e539b031-5417-407c-b2c9-641e30848a1c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received unexpected event network-vif-plugged-eafb6dbf-17d8-48eb-b6d5-3942724ec106 for instance with vm_state active and task_state None.
Jan 12 13:48:48 compute-0 ovn_controller[94974]: 2026-01-12T13:48:48Z|00116|binding|INFO|Releasing lport c493e866-7a68-4689-83e5-56bf74dbaba7 from this chassis (sb_readonly=0)
Jan 12 13:48:48 compute-0 nova_compute[181978]: 2026-01-12 13:48:48.966 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.310 181991 INFO nova.network.neutron [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Port eafb6dbf-17d8-48eb-b6d5-3942724ec106 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.311 181991 DEBUG nova.network.neutron [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [{"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.323 181991 DEBUG oslo_concurrency.lockutils [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.338 181991 DEBUG oslo_concurrency.lockutils [None req-c2ab824a-b3a6-4da3-9eb9-5ec76f9b36db d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "interface-e21c2b66-4a73-4093-b44b-c47371cf431e-eafb6dbf-17d8-48eb-b6d5-3942724ec106" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.811 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.812 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.812 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.812 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.812 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.813 181991 INFO nova.compute.manager [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Terminating instance
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.813 181991 DEBUG nova.compute.manager [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:48:49 compute-0 kernel: tapf4b71a00-88 (unregistering): left promiscuous mode
Jan 12 13:48:49 compute-0 NetworkManager[55211]: <info>  [1768225729.8367] device (tapf4b71a00-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:48:49 compute-0 ovn_controller[94974]: 2026-01-12T13:48:49Z|00117|binding|INFO|Releasing lport f4b71a00-88ba-4e02-82f2-54866d84d7bd from this chassis (sb_readonly=0)
Jan 12 13:48:49 compute-0 ovn_controller[94974]: 2026-01-12T13:48:49Z|00118|binding|INFO|Setting lport f4b71a00-88ba-4e02-82f2-54866d84d7bd down in Southbound
Jan 12 13:48:49 compute-0 ovn_controller[94974]: 2026-01-12T13:48:49Z|00119|binding|INFO|Removing iface tapf4b71a00-88 ovn-installed in OVS
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.842 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:49.854 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:61:fe 10.100.0.14'], port_security=['fa:16:3e:86:61:fe 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e21c2b66-4a73-4093-b44b-c47371cf431e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-956086f6-7f4d-41c7-b756-f2665bee9e93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ac7c355d-263e-491c-8fb3-a7c4644a1471', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14e3122d-a14b-402a-a089-41556bf5f4e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=f4b71a00-88ba-4e02-82f2-54866d84d7bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:48:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:49.855 104189 INFO neutron.agent.ovn.metadata.agent [-] Port f4b71a00-88ba-4e02-82f2-54866d84d7bd in datapath 956086f6-7f4d-41c7-b756-f2665bee9e93 unbound from our chassis
Jan 12 13:48:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:49.856 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 956086f6-7f4d-41c7-b756-f2665bee9e93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:48:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:49.857 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[48ec2c84-738f-4747-b3c1-84e5f376e834]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:49.857 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93 namespace which is not needed anymore
Jan 12 13:48:49 compute-0 nova_compute[181978]: 2026-01-12 13:48:49.857 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:49 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 12 13:48:49 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 11.449s CPU time.
Jan 12 13:48:49 compute-0 systemd-machined[153581]: Machine qemu-6-instance-00000006 terminated.
Jan 12 13:48:49 compute-0 neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93[211872]: [NOTICE]   (211876) : haproxy version is 2.8.14-c23fe91
Jan 12 13:48:49 compute-0 neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93[211872]: [NOTICE]   (211876) : path to executable is /usr/sbin/haproxy
Jan 12 13:48:49 compute-0 neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93[211872]: [ALERT]    (211876) : Current worker (211878) exited with code 143 (Terminated)
Jan 12 13:48:49 compute-0 neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93[211872]: [WARNING]  (211876) : All workers exited. Exiting... (0)
Jan 12 13:48:49 compute-0 systemd[1]: libpod-2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331.scope: Deactivated successfully.
Jan 12 13:48:49 compute-0 conmon[211872]: conmon 2dd17c6e9406792099bd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331.scope/container/memory.events
Jan 12 13:48:49 compute-0 podman[212459]: 2026-01-12 13:48:49.944764147 +0000 UTC m=+0.031697492 container died 2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 12 13:48:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331-userdata-shm.mount: Deactivated successfully.
Jan 12 13:48:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-5ac7b29713008e981bae8a0d73708d026ace941b38808da007f3c5b1e61a4978-merged.mount: Deactivated successfully.
Jan 12 13:48:49 compute-0 podman[212459]: 2026-01-12 13:48:49.963027973 +0000 UTC m=+0.049961318 container cleanup 2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 12 13:48:49 compute-0 systemd[1]: libpod-conmon-2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331.scope: Deactivated successfully.
Jan 12 13:48:50 compute-0 podman[212482]: 2026-01-12 13:48:50.00162161 +0000 UTC m=+0.024182104 container remove 2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.005 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1fdc4e-0377-4611-b935-91bfa3f5626d]: (4, ('Mon Jan 12 01:48:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93 (2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331)\n2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331\nMon Jan 12 01:48:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93 (2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331)\n2dd17c6e9406792099bd77b4c7e2fcbf9210979b6a2535c553497d8c1f39e331\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.007 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3699ef99-2a11-459f-af5a-f782273a11fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.007 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap956086f6-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:50 compute-0 kernel: tap956086f6-70: left promiscuous mode
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.010 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.025 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.027 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[4d5aeeb9-c5c3-4b6f-b8d5-77b257995cd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.036 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1cea3c64-bfcd-4999-8721-e752e803479b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.036 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[9790141b-8782-4488-8ded-79716aa16692]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.048 181991 INFO nova.virt.libvirt.driver [-] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Instance destroyed successfully.
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.048 181991 DEBUG nova.objects.instance [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid e21c2b66-4a73-4093-b44b-c47371cf431e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.049 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8b5420-4811-440c-bdc9-4cfc1b42d4b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 272853, 'reachable_time': 21277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212507, 'error': None, 'target': 'ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.050 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-956086f6-7f4d-41c7-b756-f2665bee9e93 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:48:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:48:50.050 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[027bd850-95fb-4ea2-ad2c-f834e5a49b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:48:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d956086f6\x2d7f4d\x2d41c7\x2db756\x2df2665bee9e93.mount: Deactivated successfully.
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.061 181991 DEBUG nova.virt.libvirt.vif [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:47:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1564557670',display_name='tempest-TestNetworkBasicOps-server-1564557670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1564557670',id=6,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF/1bsGLeQnmdFaxag5upZGX8c2nEffySj/4Q7V/vijQLcjUXrGhri7z9WjVl1StDm/8dFJgv2Bx084i0GN8hlc/x3+ywRJcfhYEbagbfonLagAIsEOT3tS68tmGbqIsmw==',key_name='tempest-TestNetworkBasicOps-1099157907',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:47:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-j3tw249e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:47:42Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=e21c2b66-4a73-4093-b44b-c47371cf431e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.061 181991 DEBUG nova.network.os_vif_util [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "address": "fa:16:3e:86:61:fe", "network": {"id": "956086f6-7f4d-41c7-b756-f2665bee9e93", "bridge": "br-int", "label": "tempest-network-smoke--1061508229", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4b71a00-88", "ovs_interfaceid": "f4b71a00-88ba-4e02-82f2-54866d84d7bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.062 181991 DEBUG nova.network.os_vif_util [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:61:fe,bridge_name='br-int',has_traffic_filtering=True,id=f4b71a00-88ba-4e02-82f2-54866d84d7bd,network=Network(956086f6-7f4d-41c7-b756-f2665bee9e93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4b71a00-88') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.062 181991 DEBUG os_vif [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:61:fe,bridge_name='br-int',has_traffic_filtering=True,id=f4b71a00-88ba-4e02-82f2-54866d84d7bd,network=Network(956086f6-7f4d-41c7-b756-f2665bee9e93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4b71a00-88') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.063 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.063 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4b71a00-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.064 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.065 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.066 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.067 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.068 181991 INFO os_vif [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:61:fe,bridge_name='br-int',has_traffic_filtering=True,id=f4b71a00-88ba-4e02-82f2-54866d84d7bd,network=Network(956086f6-7f4d-41c7-b756-f2665bee9e93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4b71a00-88')
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.069 181991 INFO nova.virt.libvirt.driver [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Deleting instance files /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e_del
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.069 181991 INFO nova.virt.libvirt.driver [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Deletion of /var/lib/nova/instances/e21c2b66-4a73-4093-b44b-c47371cf431e_del complete
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.101 181991 INFO nova.compute.manager [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Took 0.29 seconds to destroy the instance on the hypervisor.
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.101 181991 DEBUG oslo.service.loopingcall [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.102 181991 DEBUG nova.compute.manager [-] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.102 181991 DEBUG nova.network.neutron [-] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.627 181991 DEBUG nova.network.neutron [-] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.639 181991 INFO nova.compute.manager [-] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Took 0.54 seconds to deallocate network for instance.
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.674 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.675 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.678 181991 DEBUG nova.compute.manager [req-ff591ddb-1fbe-475e-bb14-becc7a14084a req-6022828b-d18f-4709-9616-725aed6b30ba 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-deleted-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.714 181991 DEBUG nova.compute.provider_tree [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.724 181991 DEBUG nova.scheduler.client.report [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.737 181991 DEBUG nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-changed-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.737 181991 DEBUG nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing instance network info cache due to event network-changed-f4b71a00-88ba-4e02-82f2-54866d84d7bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.737 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.737 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.737 181991 DEBUG nova.network.neutron [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Refreshing network info cache for port f4b71a00-88ba-4e02-82f2-54866d84d7bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.739 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.760 181991 INFO nova.scheduler.client.report [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance e21c2b66-4a73-4093-b44b-c47371cf431e
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.804 181991 DEBUG oslo_concurrency.lockutils [None req-ccbcb2ab-be21-4d0e-adeb-c02da783b740 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:50 compute-0 nova_compute[181978]: 2026-01-12 13:48:50.831 181991 DEBUG nova.network.neutron [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.247 181991 DEBUG nova.network.neutron [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.262 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-e21c2b66-4a73-4093-b44b-c47371cf431e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.263 181991 DEBUG nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-unplugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.263 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.263 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.263 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.263 181991 DEBUG nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] No waiting events found dispatching network-vif-unplugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.263 181991 WARNING nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received unexpected event network-vif-unplugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd for instance with vm_state deleted and task_state None.
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.264 181991 DEBUG nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received event network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.264 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.264 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.264 181991 DEBUG oslo_concurrency.lockutils [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "e21c2b66-4a73-4093-b44b-c47371cf431e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.264 181991 DEBUG nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] No waiting events found dispatching network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:48:51 compute-0 nova_compute[181978]: 2026-01-12 13:48:51.264 181991 WARNING nova.compute.manager [req-81f21496-714e-41ee-a91e-4a63c9b671cb req-73df58fd-d1b4-4d50-8c50-92138538896f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Received unexpected event network-vif-plugged-f4b71a00-88ba-4e02-82f2-54866d84d7bd for instance with vm_state deleted and task_state None.
Jan 12 13:48:52 compute-0 nova_compute[181978]: 2026-01-12 13:48:52.218 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:53 compute-0 podman[212513]: 2026-01-12 13:48:53.548681352 +0000 UTC m=+0.040824831 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 12 13:48:53 compute-0 nova_compute[181978]: 2026-01-12 13:48:53.860 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:53 compute-0 nova_compute[181978]: 2026-01-12 13:48:53.934 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:55 compute-0 nova_compute[181978]: 2026-01-12 13:48:55.064 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:48:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:48:57 compute-0 nova_compute[181978]: 2026-01-12 13:48:57.220 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:48:59 compute-0 nova_compute[181978]: 2026-01-12 13:48:59.195 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225724.1937525, 85b0aac7-4573-4a0a-953f-2061684396fa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:48:59 compute-0 nova_compute[181978]: 2026-01-12 13:48:59.196 181991 INFO nova.compute.manager [-] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] VM Stopped (Lifecycle Event)
Jan 12 13:48:59 compute-0 nova_compute[181978]: 2026-01-12 13:48:59.213 181991 DEBUG nova.compute.manager [None req-cc361be6-be4e-4a61-bc55-630383f2dab3 - - - - - -] [instance: 85b0aac7-4573-4a0a-953f-2061684396fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:00 compute-0 nova_compute[181978]: 2026-01-12 13:49:00.066 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:02 compute-0 nova_compute[181978]: 2026-01-12 13:49:02.222 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:04 compute-0 podman[212530]: 2026-01-12 13:49:04.541620399 +0000 UTC m=+0.036927585 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 12 13:49:05 compute-0 nova_compute[181978]: 2026-01-12 13:49:05.048 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225730.0474334, e21c2b66-4a73-4093-b44b-c47371cf431e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:05 compute-0 nova_compute[181978]: 2026-01-12 13:49:05.049 181991 INFO nova.compute.manager [-] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] VM Stopped (Lifecycle Event)
Jan 12 13:49:05 compute-0 nova_compute[181978]: 2026-01-12 13:49:05.068 181991 DEBUG nova.compute.manager [None req-1062eaf2-4015-48e3-bb90-8d842dab54ee - - - - - -] [instance: e21c2b66-4a73-4093-b44b-c47371cf431e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:05 compute-0 nova_compute[181978]: 2026-01-12 13:49:05.069 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:07 compute-0 nova_compute[181978]: 2026-01-12 13:49:07.223 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:07 compute-0 podman[212552]: 2026-01-12 13:49:07.546698343 +0000 UTC m=+0.041410473 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.671 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.672 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.684 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.738 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.738 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.743 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.744 181991 INFO nova.compute.claims [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.823 181991 DEBUG nova.compute.provider_tree [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.840 181991 DEBUG nova.scheduler.client.report [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.866 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.866 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.909 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.909 181991 DEBUG nova.network.neutron [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.925 181991 INFO nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:49:08 compute-0 nova_compute[181978]: 2026-01-12 13:49:08.936 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.006 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.007 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.007 181991 INFO nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Creating image(s)
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.007 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.008 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.008 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.019 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.063 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.064 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.064 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.073 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.115 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.115 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.136 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.137 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.137 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.149 181991 DEBUG nova.policy [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.179 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.180 181991 DEBUG nova.virt.disk.api [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.180 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.222 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.223 181991 DEBUG nova.virt.disk.api [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.223 181991 DEBUG nova.objects.instance [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 1bf751c2-8932-4459-8100-733329db21d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.236 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.236 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Ensure instance console log exists: /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.237 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.237 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.237 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.793 181991 DEBUG nova.network.neutron [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Successfully updated port: c5f2f44d-4e4c-448c-a052-e62a6d63a943 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.805 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.805 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.806 181991 DEBUG nova.network.neutron [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.874 181991 DEBUG nova.compute.manager [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received event network-changed-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.874 181991 DEBUG nova.compute.manager [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Refreshing instance network info cache due to event network-changed-c5f2f44d-4e4c-448c-a052-e62a6d63a943. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.875 181991 DEBUG oslo_concurrency.lockutils [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:09 compute-0 nova_compute[181978]: 2026-01-12 13:49:09.935 181991 DEBUG nova.network.neutron [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.071 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.396 181991 DEBUG nova.network.neutron [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updating instance_info_cache with network_info: [{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.419 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.420 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Instance network_info: |[{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.420 181991 DEBUG oslo_concurrency.lockutils [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.420 181991 DEBUG nova.network.neutron [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Refreshing network info cache for port c5f2f44d-4e4c-448c-a052-e62a6d63a943 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.423 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Start _get_guest_xml network_info=[{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.426 181991 WARNING nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.430 181991 DEBUG nova.virt.libvirt.host [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.430 181991 DEBUG nova.virt.libvirt.host [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.432 181991 DEBUG nova.virt.libvirt.host [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.433 181991 DEBUG nova.virt.libvirt.host [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.433 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.433 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.434 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.434 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.434 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.434 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.434 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.434 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.435 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.435 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.435 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.435 181991 DEBUG nova.virt.hardware [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.438 181991 DEBUG nova.virt.libvirt.vif [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389815740',display_name='tempest-TestNetworkBasicOps-server-389815740',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389815740',id=8,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEfnwpwBNKCJuE0cPeoM+eRFkr56r4NUuRqkqHjOe3tsX+7KqTpk6Oqv5kxIgXS8fX+siRib2qWddB/xouIwjF2oMeUR/gHNkdTVJ6Cu72owoD4rZLUDpETUMif6aSrH7g==',key_name='tempest-TestNetworkBasicOps-101412417',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-vpd8qd1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:49:08Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=1bf751c2-8932-4459-8100-733329db21d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.438 181991 DEBUG nova.network.os_vif_util [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.439 181991 DEBUG nova.network.os_vif_util [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.440 181991 DEBUG nova.objects.instance [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 1bf751c2-8932-4459-8100-733329db21d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.449 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <uuid>1bf751c2-8932-4459-8100-733329db21d2</uuid>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <name>instance-00000008</name>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-389815740</nova:name>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:49:10</nova:creationTime>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         <nova:port uuid="c5f2f44d-4e4c-448c-a052-e62a6d63a943">
Jan 12 13:49:10 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <system>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <entry name="serial">1bf751c2-8932-4459-8100-733329db21d2</entry>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <entry name="uuid">1bf751c2-8932-4459-8100-733329db21d2</entry>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </system>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <os>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   </os>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <features>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   </features>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk.config"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:44:ae:f5"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <target dev="tapc5f2f44d-4e"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/console.log" append="off"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <video>
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </video>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:49:10 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:49:10 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:49:10 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:49:10 compute-0 nova_compute[181978]: </domain>
Jan 12 13:49:10 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.450 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Preparing to wait for external event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.450 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.451 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.451 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.451 181991 DEBUG nova.virt.libvirt.vif [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389815740',display_name='tempest-TestNetworkBasicOps-server-389815740',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389815740',id=8,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEfnwpwBNKCJuE0cPeoM+eRFkr56r4NUuRqkqHjOe3tsX+7KqTpk6Oqv5kxIgXS8fX+siRib2qWddB/xouIwjF2oMeUR/gHNkdTVJ6Cu72owoD4rZLUDpETUMif6aSrH7g==',key_name='tempest-TestNetworkBasicOps-101412417',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-vpd8qd1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:49:08Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=1bf751c2-8932-4459-8100-733329db21d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.451 181991 DEBUG nova.network.os_vif_util [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.452 181991 DEBUG nova.network.os_vif_util [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.452 181991 DEBUG os_vif [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.453 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.453 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.453 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.455 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.455 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5f2f44d-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.456 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5f2f44d-4e, col_values=(('external_ids', {'iface-id': 'c5f2f44d-4e4c-448c-a052-e62a6d63a943', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:ae:f5', 'vm-uuid': '1bf751c2-8932-4459-8100-733329db21d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.457 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 NetworkManager[55211]: <info>  [1768225750.4576] manager: (tapc5f2f44d-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.459 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.462 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.463 181991 INFO os_vif [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e')
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.506 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.507 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.507 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:44:ae:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.507 181991 INFO nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Using config drive
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.739 181991 INFO nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Creating config drive at /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk.config
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.743 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe9jctx0t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.859 181991 DEBUG oslo_concurrency.processutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe9jctx0t" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:10 compute-0 kernel: tapc5f2f44d-4e: entered promiscuous mode
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.893 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 NetworkManager[55211]: <info>  [1768225750.8951] manager: (tapc5f2f44d-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.895 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 ovn_controller[94974]: 2026-01-12T13:49:10Z|00120|binding|INFO|Claiming lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 for this chassis.
Jan 12 13:49:10 compute-0 ovn_controller[94974]: 2026-01-12T13:49:10Z|00121|binding|INFO|c5f2f44d-4e4c-448c-a052-e62a6d63a943: Claiming fa:16:3e:44:ae:f5 10.100.0.7
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.903 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.904 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:ae:f5 10.100.0.7'], port_security=['fa:16:3e:44:ae:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1bf751c2-8932-4459-8100-733329db21d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-087938fb-9a2d-44f8-8567-4aec6e16757d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e7848670-66d3-47c2-aa04-0080edfddbef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bdf92ca-2ac3-454f-8541-fbb26c1056cb, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=c5f2f44d-4e4c-448c-a052-e62a6d63a943) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.905 104189 INFO neutron.agent.ovn.metadata.agent [-] Port c5f2f44d-4e4c-448c-a052-e62a6d63a943 in datapath 087938fb-9a2d-44f8-8567-4aec6e16757d bound to our chassis
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.905 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 087938fb-9a2d-44f8-8567-4aec6e16757d
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.914 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[231d8cb6-7189-446e-ba5a-a53149233ee6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.914 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap087938fb-91 in ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.916 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap087938fb-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.916 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa5fb0f-424d-48ec-9267-fec2ef668deb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:10 compute-0 systemd-udevd[212603]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.917 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1fe6ad-9e3c-40d7-805d-24d5fe8ca71e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.927 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae7413b-ad22-4fa1-ab93-20af55176771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:10 compute-0 NetworkManager[55211]: <info>  [1768225750.9354] device (tapc5f2f44d-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:49:10 compute-0 systemd-machined[153581]: New machine qemu-8-instance-00000008.
Jan 12 13:49:10 compute-0 NetworkManager[55211]: <info>  [1768225750.9381] device (tapc5f2f44d-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:49:10 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.951 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a06248-efe0-42fa-a8af-3bbb35d43a3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.953 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 ovn_controller[94974]: 2026-01-12T13:49:10Z|00122|binding|INFO|Setting lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 ovn-installed in OVS
Jan 12 13:49:10 compute-0 ovn_controller[94974]: 2026-01-12T13:49:10Z|00123|binding|INFO|Setting lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 up in Southbound
Jan 12 13:49:10 compute-0 nova_compute[181978]: 2026-01-12 13:49:10.958 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.973 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[6844b0b5-72a8-4cb3-935e-b99e6bd77c99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.977 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa7a14d-705f-4492-bfd7-e18528749fbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:10 compute-0 NetworkManager[55211]: <info>  [1768225750.9783] manager: (tap087938fb-90): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Jan 12 13:49:10 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:10.997 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e40df3-dec6-4229-b03e-43a8af56b17b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.000 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[dafeaca0-9979-4c62-9c0a-b04eaea0c0b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 NetworkManager[55211]: <info>  [1768225751.0126] device (tap087938fb-90): carrier: link connected
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.015 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[84c92b1a-0158-45bf-82cc-6714579d8f2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.028 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[851dbe13-a635-4716-bed1-e20db750ce78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap087938fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:d6:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 281703, 'reachable_time': 40111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212629, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.037 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[19b97036-2293-46f4-8f4a-0474cd3f917c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:d6b9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 281703, 'tstamp': 281703}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212630, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.048 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[5eafa472-6bec-413e-920a-aa0c0c7dd846]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap087938fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:d6:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 281703, 'reachable_time': 40111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212631, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.066 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0f705e-32de-487b-8b1a-dfd7fc4b9990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.100 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c616f55e-74e8-4186-a69b-c981fc3ec2ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.101 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap087938fb-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.101 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.102 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap087938fb-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:11 compute-0 kernel: tap087938fb-90: entered promiscuous mode
Jan 12 13:49:11 compute-0 NetworkManager[55211]: <info>  [1768225751.1037] manager: (tap087938fb-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.103 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.107 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap087938fb-90, col_values=(('external_ids', {'iface-id': '7e222fb1-af5e-49d1-bde0-22ef72b91e09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:11 compute-0 ovn_controller[94974]: 2026-01-12T13:49:11Z|00124|binding|INFO|Releasing lport 7e222fb1-af5e-49d1-bde0-22ef72b91e09 from this chassis (sb_readonly=0)
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.109 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/087938fb-9a2d-44f8-8567-4aec6e16757d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/087938fb-9a2d-44f8-8567-4aec6e16757d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.110 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b9af3c46-4bb7-499f-a3d3-6d2e79856519]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.110 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-087938fb-9a2d-44f8-8567-4aec6e16757d
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/087938fb-9a2d-44f8-8567-4aec6e16757d.pid.haproxy
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID 087938fb-9a2d-44f8-8567-4aec6e16757d
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:49:11 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:11.112 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'env', 'PROCESS_TAG=haproxy-087938fb-9a2d-44f8-8567-4aec6e16757d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/087938fb-9a2d-44f8-8567-4aec6e16757d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.122 181991 DEBUG nova.compute.manager [req-3a923479-9d8e-43ca-bf5b-3cd084e966b5 req-ee10eb98-6284-4d88-b76d-115cf84b18d0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.122 181991 DEBUG oslo_concurrency.lockutils [req-3a923479-9d8e-43ca-bf5b-3cd084e966b5 req-ee10eb98-6284-4d88-b76d-115cf84b18d0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.123 181991 DEBUG oslo_concurrency.lockutils [req-3a923479-9d8e-43ca-bf5b-3cd084e966b5 req-ee10eb98-6284-4d88-b76d-115cf84b18d0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.123 181991 DEBUG oslo_concurrency.lockutils [req-3a923479-9d8e-43ca-bf5b-3cd084e966b5 req-ee10eb98-6284-4d88-b76d-115cf84b18d0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.123 181991 DEBUG nova.compute.manager [req-3a923479-9d8e-43ca-bf5b-3cd084e966b5 req-ee10eb98-6284-4d88-b76d-115cf84b18d0 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Processing event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.124 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.155 181991 DEBUG nova.network.neutron [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updated VIF entry in instance network info cache for port c5f2f44d-4e4c-448c-a052-e62a6d63a943. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.156 181991 DEBUG nova.network.neutron [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updating instance_info_cache with network_info: [{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.168 181991 DEBUG oslo_concurrency.lockutils [req-666b7fc1-1548-413e-b579-2d24148b7e89 req-0671a5d4-51bd-42ef-928c-9c5753db897f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:11 compute-0 podman[212659]: 2026-01-12 13:49:11.385646378 +0000 UTC m=+0.030480805 container create fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:49:11 compute-0 systemd[1]: Started libpod-conmon-fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97.scope.
Jan 12 13:49:11 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:49:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f50691b3602adce21271c24da0ab3adff6f97c3420901badbfbadddbe7134966/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:49:11 compute-0 podman[212659]: 2026-01-12 13:49:11.466660916 +0000 UTC m=+0.111495343 container init fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 12 13:49:11 compute-0 podman[212659]: 2026-01-12 13:49:11.370809594 +0000 UTC m=+0.015644022 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:49:11 compute-0 podman[212659]: 2026-01-12 13:49:11.471230116 +0000 UTC m=+0.116064533 container start fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 12 13:49:11 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212671]: [NOTICE]   (212675) : New worker (212677) forked
Jan 12 13:49:11 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212671]: [NOTICE]   (212675) : Loading success.
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.903 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.904 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225751.9028928, 1bf751c2-8932-4459-8100-733329db21d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.905 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] VM Started (Lifecycle Event)
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.907 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.910 181991 INFO nova.virt.libvirt.driver [-] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Instance spawned successfully.
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.910 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.922 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.926 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.929 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.930 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.930 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.930 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.931 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.931 181991 DEBUG nova.virt.libvirt.driver [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.945 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.945 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225751.9030564, 1bf751c2-8932-4459-8100-733329db21d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.946 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] VM Paused (Lifecycle Event)
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.963 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.965 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225751.9063287, 1bf751c2-8932-4459-8100-733329db21d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.965 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] VM Resumed (Lifecycle Event)
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.970 181991 INFO nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Took 2.96 seconds to spawn the instance on the hypervisor.
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.970 181991 DEBUG nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.975 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.977 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:49:11 compute-0 nova_compute[181978]: 2026-01-12 13:49:11.994 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:49:12 compute-0 nova_compute[181978]: 2026-01-12 13:49:12.010 181991 INFO nova.compute.manager [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Took 3.30 seconds to build instance.
Jan 12 13:49:12 compute-0 nova_compute[181978]: 2026-01-12 13:49:12.019 181991 DEBUG oslo_concurrency.lockutils [None req-fe7a2a98-ac9b-4bb2-af5c-52a9facdce25 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:12 compute-0 nova_compute[181978]: 2026-01-12 13:49:12.226 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:13 compute-0 nova_compute[181978]: 2026-01-12 13:49:13.171 181991 DEBUG nova.compute.manager [req-d6948eb6-2b2b-4a1c-9162-1f583784edcc req-815466c0-3514-44c1-a1d7-91eec3a2139d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:13 compute-0 nova_compute[181978]: 2026-01-12 13:49:13.171 181991 DEBUG oslo_concurrency.lockutils [req-d6948eb6-2b2b-4a1c-9162-1f583784edcc req-815466c0-3514-44c1-a1d7-91eec3a2139d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:13 compute-0 nova_compute[181978]: 2026-01-12 13:49:13.171 181991 DEBUG oslo_concurrency.lockutils [req-d6948eb6-2b2b-4a1c-9162-1f583784edcc req-815466c0-3514-44c1-a1d7-91eec3a2139d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:13 compute-0 nova_compute[181978]: 2026-01-12 13:49:13.171 181991 DEBUG oslo_concurrency.lockutils [req-d6948eb6-2b2b-4a1c-9162-1f583784edcc req-815466c0-3514-44c1-a1d7-91eec3a2139d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:13 compute-0 nova_compute[181978]: 2026-01-12 13:49:13.171 181991 DEBUG nova.compute.manager [req-d6948eb6-2b2b-4a1c-9162-1f583784edcc req-815466c0-3514-44c1-a1d7-91eec3a2139d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] No waiting events found dispatching network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:49:13 compute-0 nova_compute[181978]: 2026-01-12 13:49:13.171 181991 WARNING nova.compute.manager [req-d6948eb6-2b2b-4a1c-9162-1f583784edcc req-815466c0-3514-44c1-a1d7-91eec3a2139d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received unexpected event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 for instance with vm_state active and task_state None.
Jan 12 13:49:15 compute-0 nova_compute[181978]: 2026-01-12 13:49:15.458 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:16.396 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:49:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:16.397 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:49:16 compute-0 nova_compute[181978]: 2026-01-12 13:49:16.398 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.226 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:17 compute-0 ovn_controller[94974]: 2026-01-12T13:49:17Z|00125|binding|INFO|Releasing lport 7e222fb1-af5e-49d1-bde0-22ef72b91e09 from this chassis (sb_readonly=0)
Jan 12 13:49:17 compute-0 NetworkManager[55211]: <info>  [1768225757.4131] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 12 13:49:17 compute-0 NetworkManager[55211]: <info>  [1768225757.4138] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.411 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:17 compute-0 ovn_controller[94974]: 2026-01-12T13:49:17Z|00126|binding|INFO|Releasing lport 7e222fb1-af5e-49d1-bde0-22ef72b91e09 from this chassis (sb_readonly=0)
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.442 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.445 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:49:17 compute-0 podman[212691]: 2026-01-12 13:49:17.569469 +0000 UTC m=+0.058451189 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:49:17 compute-0 podman[212695]: 2026-01-12 13:49:17.581313788 +0000 UTC m=+0.064729643 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public)
Jan 12 13:49:17 compute-0 podman[212690]: 2026-01-12 13:49:17.591511179 +0000 UTC m=+0.084686542 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.920 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.921 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquired lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.921 181991 DEBUG nova.network.neutron [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 12 13:49:17 compute-0 nova_compute[181978]: 2026-01-12 13:49:17.921 181991 DEBUG nova.objects.instance [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1bf751c2-8932-4459-8100-733329db21d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.100 181991 DEBUG nova.compute.manager [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received event network-changed-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.100 181991 DEBUG nova.compute.manager [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Refreshing instance network info cache due to event network-changed-c5f2f44d-4e4c-448c-a052-e62a6d63a943. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.101 181991 DEBUG oslo_concurrency.lockutils [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.276 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.277 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.277 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.277 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.277 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.278 181991 INFO nova.compute.manager [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Terminating instance
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.278 181991 DEBUG nova.compute.manager [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:49:18 compute-0 kernel: tapc5f2f44d-4e (unregistering): left promiscuous mode
Jan 12 13:49:18 compute-0 NetworkManager[55211]: <info>  [1768225758.2931] device (tapc5f2f44d-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.296 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 ovn_controller[94974]: 2026-01-12T13:49:18Z|00127|binding|INFO|Releasing lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 from this chassis (sb_readonly=0)
Jan 12 13:49:18 compute-0 ovn_controller[94974]: 2026-01-12T13:49:18Z|00128|binding|INFO|Setting lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 down in Southbound
Jan 12 13:49:18 compute-0 ovn_controller[94974]: 2026-01-12T13:49:18Z|00129|binding|INFO|Removing iface tapc5f2f44d-4e ovn-installed in OVS
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.299 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.303 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:ae:f5 10.100.0.7'], port_security=['fa:16:3e:44:ae:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1bf751c2-8932-4459-8100-733329db21d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-087938fb-9a2d-44f8-8567-4aec6e16757d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e7848670-66d3-47c2-aa04-0080edfddbef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bdf92ca-2ac3-454f-8541-fbb26c1056cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=c5f2f44d-4e4c-448c-a052-e62a6d63a943) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.304 104189 INFO neutron.agent.ovn.metadata.agent [-] Port c5f2f44d-4e4c-448c-a052-e62a6d63a943 in datapath 087938fb-9a2d-44f8-8567-4aec6e16757d unbound from our chassis
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.305 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 087938fb-9a2d-44f8-8567-4aec6e16757d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.306 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8326fa36-d07c-44e2-9e13-5fc44f613e10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.306 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d namespace which is not needed anymore
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.312 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 12 13:49:18 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 7.412s CPU time.
Jan 12 13:49:18 compute-0 systemd-machined[153581]: Machine qemu-8-instance-00000008 terminated.
Jan 12 13:49:18 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212671]: [NOTICE]   (212675) : haproxy version is 2.8.14-c23fe91
Jan 12 13:49:18 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212671]: [NOTICE]   (212675) : path to executable is /usr/sbin/haproxy
Jan 12 13:49:18 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212671]: [ALERT]    (212675) : Current worker (212677) exited with code 143 (Terminated)
Jan 12 13:49:18 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212671]: [WARNING]  (212675) : All workers exited. Exiting... (0)
Jan 12 13:49:18 compute-0 systemd[1]: libpod-fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97.scope: Deactivated successfully.
Jan 12 13:49:18 compute-0 podman[212771]: 2026-01-12 13:49:18.406401227 +0000 UTC m=+0.034314773 container died fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 12 13:49:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97-userdata-shm.mount: Deactivated successfully.
Jan 12 13:49:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-f50691b3602adce21271c24da0ab3adff6f97c3420901badbfbadddbe7134966-merged.mount: Deactivated successfully.
Jan 12 13:49:18 compute-0 podman[212771]: 2026-01-12 13:49:18.428206442 +0000 UTC m=+0.056119987 container cleanup fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 12 13:49:18 compute-0 systemd[1]: libpod-conmon-fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97.scope: Deactivated successfully.
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.463 181991 DEBUG nova.compute.manager [req-c3a06872-6c2d-4941-80f4-05758d06d13d req-0755d0a7-3b79-4cdd-a323-3bb43e5b1dfb 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received event network-vif-unplugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.463 181991 DEBUG oslo_concurrency.lockutils [req-c3a06872-6c2d-4941-80f4-05758d06d13d req-0755d0a7-3b79-4cdd-a323-3bb43e5b1dfb 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.463 181991 DEBUG oslo_concurrency.lockutils [req-c3a06872-6c2d-4941-80f4-05758d06d13d req-0755d0a7-3b79-4cdd-a323-3bb43e5b1dfb 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.463 181991 DEBUG oslo_concurrency.lockutils [req-c3a06872-6c2d-4941-80f4-05758d06d13d req-0755d0a7-3b79-4cdd-a323-3bb43e5b1dfb 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.464 181991 DEBUG nova.compute.manager [req-c3a06872-6c2d-4941-80f4-05758d06d13d req-0755d0a7-3b79-4cdd-a323-3bb43e5b1dfb 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] No waiting events found dispatching network-vif-unplugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.464 181991 DEBUG nova.compute.manager [req-c3a06872-6c2d-4941-80f4-05758d06d13d req-0755d0a7-3b79-4cdd-a323-3bb43e5b1dfb 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received event network-vif-unplugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:49:18 compute-0 podman[212795]: 2026-01-12 13:49:18.470552723 +0000 UTC m=+0.024578979 container remove fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.474 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6a93a7-d598-464f-97c1-34fd9ec6fe7a]: (4, ('Mon Jan 12 01:49:18 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d (fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97)\nfc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97\nMon Jan 12 01:49:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d (fc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97)\nfc60b2f8c45737562abf5ee147c308f7d839eb6e3e762d57bb6a6bcff8a6ad97\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.475 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[204a3940-228d-4b95-babb-ad28886517c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.476 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap087938fb-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.477 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 kernel: tap087938fb-90: left promiscuous mode
Jan 12 13:49:18 compute-0 NetworkManager[55211]: <info>  [1768225758.4936] manager: (tapc5f2f44d-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.493 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.495 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7405793b-292c-4153-9458-62d12952d385]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.504 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[bd03960d-f85f-40bb-8c3b-a91ebf5552cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.504 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8cd426-a054-41da-8f82-cd05e3751960]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.516 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab1a81d-d19c-4865-ac45-06f5329e450c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 281699, 'reachable_time': 18702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212817, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d087938fb\x2d9a2d\x2d44f8\x2d8567\x2d4aec6e16757d.mount: Deactivated successfully.
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.518 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:49:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:18.518 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[4d596aae-a348-4bb9-99ca-7f938c918d76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.523 181991 INFO nova.virt.libvirt.driver [-] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Instance destroyed successfully.
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.523 181991 DEBUG nova.objects.instance [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 1bf751c2-8932-4459-8100-733329db21d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.535 181991 DEBUG nova.virt.libvirt.vif [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389815740',display_name='tempest-TestNetworkBasicOps-server-389815740',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389815740',id=8,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEfnwpwBNKCJuE0cPeoM+eRFkr56r4NUuRqkqHjOe3tsX+7KqTpk6Oqv5kxIgXS8fX+siRib2qWddB/xouIwjF2oMeUR/gHNkdTVJ6Cu72owoD4rZLUDpETUMif6aSrH7g==',key_name='tempest-TestNetworkBasicOps-101412417',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:49:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-vpd8qd1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:49:11Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=1bf751c2-8932-4459-8100-733329db21d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.536 181991 DEBUG nova.network.os_vif_util [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.536 181991 DEBUG nova.network.os_vif_util [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.537 181991 DEBUG os_vif [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.538 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.538 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5f2f44d-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.540 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.541 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.542 181991 INFO os_vif [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e')
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.543 181991 INFO nova.virt.libvirt.driver [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Deleting instance files /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2_del
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.543 181991 INFO nova.virt.libvirt.driver [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Deletion of /var/lib/nova/instances/1bf751c2-8932-4459-8100-733329db21d2_del complete
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.584 181991 INFO nova.compute.manager [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.585 181991 DEBUG oslo.service.loopingcall [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.585 181991 DEBUG nova.compute.manager [-] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.585 181991 DEBUG nova.network.neutron [-] [instance: 1bf751c2-8932-4459-8100-733329db21d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.854 181991 DEBUG nova.network.neutron [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updating instance_info_cache with network_info: [{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.868 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Releasing lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.869 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.869 181991 DEBUG oslo_concurrency.lockutils [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.869 181991 DEBUG nova.network.neutron [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Refreshing network info cache for port c5f2f44d-4e4c-448c-a052-e62a6d63a943 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:49:18 compute-0 nova_compute[181978]: 2026-01-12 13:49:18.870 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.822 181991 DEBUG nova.network.neutron [-] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.833 181991 INFO nova.compute.manager [-] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Took 1.25 seconds to deallocate network for instance.
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.863 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.863 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.891 181991 DEBUG nova.scheduler.client.report [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Refreshing inventories for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.906 181991 DEBUG nova.scheduler.client.report [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Updating ProviderTree inventory for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.907 181991 DEBUG nova.compute.provider_tree [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.918 181991 DEBUG nova.scheduler.client.report [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Refreshing aggregate associations for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.931 181991 DEBUG nova.scheduler.client.report [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Refreshing trait associations for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_AVX512VAES,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.954 181991 DEBUG nova.compute.provider_tree [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.967 181991 DEBUG nova.scheduler.client.report [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:49:19 compute-0 nova_compute[181978]: 2026-01-12 13:49:19.981 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.004 181991 INFO nova.scheduler.client.report [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 1bf751c2-8932-4459-8100-733329db21d2
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.034 181991 DEBUG nova.network.neutron [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updated VIF entry in instance network info cache for port c5f2f44d-4e4c-448c-a052-e62a6d63a943. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.034 181991 DEBUG nova.network.neutron [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Updating instance_info_cache with network_info: [{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.059 181991 DEBUG oslo_concurrency.lockutils [None req-d7b3cb8b-fa11-4fb7-a135-119e6ab96cad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.060 181991 DEBUG oslo_concurrency.lockutils [req-38534286-9df9-4287-8b4c-5ff01a9c5466 req-d354c43e-cf67-4b6e-aff8-3cb33606e527 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-1bf751c2-8932-4459-8100-733329db21d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.540 181991 DEBUG nova.compute.manager [req-9369397d-cb5b-4ee5-bfcc-1a0f621c4a76 req-e28e445b-68d0-40a5-82eb-bb53be1edc40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.540 181991 DEBUG oslo_concurrency.lockutils [req-9369397d-cb5b-4ee5-bfcc-1a0f621c4a76 req-e28e445b-68d0-40a5-82eb-bb53be1edc40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "1bf751c2-8932-4459-8100-733329db21d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.540 181991 DEBUG oslo_concurrency.lockutils [req-9369397d-cb5b-4ee5-bfcc-1a0f621c4a76 req-e28e445b-68d0-40a5-82eb-bb53be1edc40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.541 181991 DEBUG oslo_concurrency.lockutils [req-9369397d-cb5b-4ee5-bfcc-1a0f621c4a76 req-e28e445b-68d0-40a5-82eb-bb53be1edc40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "1bf751c2-8932-4459-8100-733329db21d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.541 181991 DEBUG nova.compute.manager [req-9369397d-cb5b-4ee5-bfcc-1a0f621c4a76 req-e28e445b-68d0-40a5-82eb-bb53be1edc40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] No waiting events found dispatching network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:49:20 compute-0 nova_compute[181978]: 2026-01-12 13:49:20.541 181991 WARNING nova.compute.manager [req-9369397d-cb5b-4ee5-bfcc-1a0f621c4a76 req-e28e445b-68d0-40a5-82eb-bb53be1edc40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Received unexpected event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 for instance with vm_state deleted and task_state None.
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.497 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.497 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.497 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.498 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.694 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.695 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5732MB free_disk=73.38068008422852GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.695 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.695 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.733 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.733 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.755 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.761 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.775 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:49:21 compute-0 nova_compute[181978]: 2026-01-12 13:49:21.775 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:22 compute-0 nova_compute[181978]: 2026-01-12 13:49:22.228 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:22 compute-0 nova_compute[181978]: 2026-01-12 13:49:22.770 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:22 compute-0 nova_compute[181978]: 2026-01-12 13:49:22.771 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:22 compute-0 nova_compute[181978]: 2026-01-12 13:49:22.771 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:23 compute-0 nova_compute[181978]: 2026-01-12 13:49:23.539 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.238 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.238 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.249 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.290 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.290 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.295 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.296 181991 INFO nova.compute.claims [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.363 181991 DEBUG nova.compute.provider_tree [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.373 181991 DEBUG nova.scheduler.client.report [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.385 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.386 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.415 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.415 181991 DEBUG nova.network.neutron [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.428 181991 INFO nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.438 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.490 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.491 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.491 181991 INFO nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Creating image(s)
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.492 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.492 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.492 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.502 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.548 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.549 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.550 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:24 compute-0 podman[212825]: 2026-01-12 13:49:24.553565182 +0000 UTC m=+0.037092705 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.559 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.603 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.604 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.624 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.624 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.625 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.669 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.669 181991 DEBUG nova.virt.disk.api [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.670 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.723 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.724 181991 DEBUG nova.virt.disk.api [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.724 181991 DEBUG nova.objects.instance [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 84bab5e9-2a3f-41cf-98f9-00af683fe4d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.734 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.734 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Ensure instance console log exists: /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.735 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.735 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:24 compute-0 nova_compute[181978]: 2026-01-12 13:49:24.735 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:25 compute-0 nova_compute[181978]: 2026-01-12 13:49:25.399 181991 DEBUG nova.policy [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:49:25 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:25.399 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.231 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.377 181991 DEBUG nova.network.neutron [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Successfully updated port: c5f2f44d-4e4c-448c-a052-e62a6d63a943 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.388 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-84bab5e9-2a3f-41cf-98f9-00af683fe4d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.388 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-84bab5e9-2a3f-41cf-98f9-00af683fe4d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.389 181991 DEBUG nova.network.neutron [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.450 181991 DEBUG nova.compute.manager [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received event network-changed-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.450 181991 DEBUG nova.compute.manager [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Refreshing instance network info cache due to event network-changed-c5f2f44d-4e4c-448c-a052-e62a6d63a943. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.450 181991 DEBUG oslo_concurrency.lockutils [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-84bab5e9-2a3f-41cf-98f9-00af683fe4d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:27 compute-0 nova_compute[181978]: 2026-01-12 13:49:27.516 181991 DEBUG nova.network.neutron [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:49:28 compute-0 nova_compute[181978]: 2026-01-12 13:49:28.540 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.302 181991 DEBUG nova.network.neutron [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Updating instance_info_cache with network_info: [{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.317 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-84bab5e9-2a3f-41cf-98f9-00af683fe4d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.317 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Instance network_info: |[{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.318 181991 DEBUG oslo_concurrency.lockutils [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-84bab5e9-2a3f-41cf-98f9-00af683fe4d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.318 181991 DEBUG nova.network.neutron [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Refreshing network info cache for port c5f2f44d-4e4c-448c-a052-e62a6d63a943 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.320 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Start _get_guest_xml network_info=[{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.323 181991 WARNING nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.329 181991 DEBUG nova.virt.libvirt.host [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.329 181991 DEBUG nova.virt.libvirt.host [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.331 181991 DEBUG nova.virt.libvirt.host [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.331 181991 DEBUG nova.virt.libvirt.host [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.332 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.332 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.332 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.332 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.333 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.333 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.333 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.333 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.333 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.334 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.334 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.334 181991 DEBUG nova.virt.hardware [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.337 181991 DEBUG nova.virt.libvirt.vif [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:49:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1228041536',display_name='tempest-TestNetworkBasicOps-server-1228041536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1228041536',id=9,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLcH8SnX1ny+LljpsoBVLn3eoEDwdXwWpPae9qvujLiEK6kCFopKpzzNp2Jjuwn9M3iMy9BH+DLT+6s08X9xoW+n9d4cqAuat3ZOSFnzvy27En1wwVuwGz4jU+N1ZlOG4A==',key_name='tempest-TestNetworkBasicOps-1201816283',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-962v0oae',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:49:24Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=84bab5e9-2a3f-41cf-98f9-00af683fe4d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.337 181991 DEBUG nova.network.os_vif_util [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.337 181991 DEBUG nova.network.os_vif_util [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.338 181991 DEBUG nova.objects.instance [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 84bab5e9-2a3f-41cf-98f9-00af683fe4d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.346 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <uuid>84bab5e9-2a3f-41cf-98f9-00af683fe4d1</uuid>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <name>instance-00000009</name>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-1228041536</nova:name>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:49:29</nova:creationTime>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         <nova:port uuid="c5f2f44d-4e4c-448c-a052-e62a6d63a943">
Jan 12 13:49:29 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <system>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <entry name="serial">84bab5e9-2a3f-41cf-98f9-00af683fe4d1</entry>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <entry name="uuid">84bab5e9-2a3f-41cf-98f9-00af683fe4d1</entry>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </system>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <os>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   </os>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <features>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   </features>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk.config"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:44:ae:f5"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <target dev="tapc5f2f44d-4e"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/console.log" append="off"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <video>
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </video>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:49:29 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:49:29 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:49:29 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:49:29 compute-0 nova_compute[181978]: </domain>
Jan 12 13:49:29 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.347 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Preparing to wait for external event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.347 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.347 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.348 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.348 181991 DEBUG nova.virt.libvirt.vif [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:49:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1228041536',display_name='tempest-TestNetworkBasicOps-server-1228041536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1228041536',id=9,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLcH8SnX1ny+LljpsoBVLn3eoEDwdXwWpPae9qvujLiEK6kCFopKpzzNp2Jjuwn9M3iMy9BH+DLT+6s08X9xoW+n9d4cqAuat3ZOSFnzvy27En1wwVuwGz4jU+N1ZlOG4A==',key_name='tempest-TestNetworkBasicOps-1201816283',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-962v0oae',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:49:24Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=84bab5e9-2a3f-41cf-98f9-00af683fe4d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.348 181991 DEBUG nova.network.os_vif_util [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.349 181991 DEBUG nova.network.os_vif_util [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.349 181991 DEBUG os_vif [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.350 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.350 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.350 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.352 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.353 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5f2f44d-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.353 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5f2f44d-4e, col_values=(('external_ids', {'iface-id': 'c5f2f44d-4e4c-448c-a052-e62a6d63a943', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:ae:f5', 'vm-uuid': '84bab5e9-2a3f-41cf-98f9-00af683fe4d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:29 compute-0 NetworkManager[55211]: <info>  [1768225769.3550] manager: (tapc5f2f44d-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.356 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.357 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.358 181991 INFO os_vif [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e')
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.383 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.384 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.384 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:44:ae:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.384 181991 INFO nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Using config drive
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.624 181991 INFO nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Creating config drive at /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk.config
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.628 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczxdi0e3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.744 181991 DEBUG oslo_concurrency.processutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczxdi0e3" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:29 compute-0 kernel: tapc5f2f44d-4e: entered promiscuous mode
Jan 12 13:49:29 compute-0 NetworkManager[55211]: <info>  [1768225769.7767] manager: (tapc5f2f44d-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.777 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_controller[94974]: 2026-01-12T13:49:29Z|00130|binding|INFO|Claiming lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 for this chassis.
Jan 12 13:49:29 compute-0 ovn_controller[94974]: 2026-01-12T13:49:29Z|00131|binding|INFO|c5f2f44d-4e4c-448c-a052-e62a6d63a943: Claiming fa:16:3e:44:ae:f5 10.100.0.7
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.783 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:ae:f5 10.100.0.7'], port_security=['fa:16:3e:44:ae:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84bab5e9-2a3f-41cf-98f9-00af683fe4d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-087938fb-9a2d-44f8-8567-4aec6e16757d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'e7848670-66d3-47c2-aa04-0080edfddbef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bdf92ca-2ac3-454f-8541-fbb26c1056cb, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=c5f2f44d-4e4c-448c-a052-e62a6d63a943) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.784 104189 INFO neutron.agent.ovn.metadata.agent [-] Port c5f2f44d-4e4c-448c-a052-e62a6d63a943 in datapath 087938fb-9a2d-44f8-8567-4aec6e16757d bound to our chassis
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.785 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 087938fb-9a2d-44f8-8567-4aec6e16757d
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.791 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_controller[94974]: 2026-01-12T13:49:29Z|00132|binding|INFO|Setting lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 ovn-installed in OVS
Jan 12 13:49:29 compute-0 ovn_controller[94974]: 2026-01-12T13:49:29Z|00133|binding|INFO|Setting lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 up in Southbound
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.795 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3357f27f-d2c6-4300-b6bd-41eda0b8187e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.795 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap087938fb-91 in ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.795 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.797 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap087938fb-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.797 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[ec86dc56-f960-4e70-bdf1-461cd126861f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.797 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.798 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c88fba-d73c-4666-bc3b-c2b15560e98b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.799 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 systemd-udevd[212878]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.809 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[c1422f1e-8f0b-4f3e-af0d-5c4d88f24256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 systemd-machined[153581]: New machine qemu-9-instance-00000009.
Jan 12 13:49:29 compute-0 NetworkManager[55211]: <info>  [1768225769.8192] device (tapc5f2f44d-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:49:29 compute-0 NetworkManager[55211]: <info>  [1768225769.8197] device (tapc5f2f44d-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.820 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[07935570-d72f-4fb5-a8d7-f0b4fbf45f8f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.839 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[da0ea329-3270-40bb-9cb6-91eb30fffddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.843 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[56abbdb4-97aa-4923-925f-083b40f2c784]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 NetworkManager[55211]: <info>  [1768225769.8434] manager: (tap087938fb-90): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Jan 12 13:49:29 compute-0 systemd-udevd[212881]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.865 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6e41e9-cde7-4260-bad9-11e0eb7274fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.867 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[d6caf910-ac82-4075-bc47-c5cc9ba32ce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 NetworkManager[55211]: <info>  [1768225769.8841] device (tap087938fb-90): carrier: link connected
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.887 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[ba360ad3-6ec6-405f-b093-7d557ba885bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.899 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[a322375a-9a81-4001-9419-1fdd7d338259]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap087938fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:d6:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 283590, 'reachable_time': 31746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212901, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.910 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[a92d18a6-80e6-4562-acc8-517e02bea9c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:d6b9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 283590, 'tstamp': 283590}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212902, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.921 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[67c774e8-a209-4afe-9920-8e7678a53099]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap087938fb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:d6:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 283590, 'reachable_time': 31746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212903, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.939 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e78e9220-daec-4673-afc4-ffb529d95cc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.974 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f07c72-0523-4cd2-86ec-c5a52e6395c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.975 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap087938fb-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.976 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.976 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap087938fb-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:29 compute-0 kernel: tap087938fb-90: entered promiscuous mode
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.977 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 NetworkManager[55211]: <info>  [1768225769.9795] manager: (tap087938fb-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.981 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.982 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap087938fb-90, col_values=(('external_ids', {'iface-id': '7e222fb1-af5e-49d1-bde0-22ef72b91e09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.982 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_controller[94974]: 2026-01-12T13:49:29Z|00134|binding|INFO|Releasing lport 7e222fb1-af5e-49d1-bde0-22ef72b91e09 from this chassis (sb_readonly=0)
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.983 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.985 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/087938fb-9a2d-44f8-8567-4aec6e16757d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/087938fb-9a2d-44f8-8567-4aec6e16757d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:49:29 compute-0 nova_compute[181978]: 2026-01-12 13:49:29.995 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.994 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7f487cb6-2ffe-49f8-8ffc-5b74dcddda9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.996 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-087938fb-9a2d-44f8-8567-4aec6e16757d
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/087938fb-9a2d-44f8-8567-4aec6e16757d.pid.haproxy
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID 087938fb-9a2d-44f8-8567-4aec6e16757d
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:49:29 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:29.997 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'env', 'PROCESS_TAG=haproxy-087938fb-9a2d-44f8-8567-4aec6e16757d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/087938fb-9a2d-44f8-8567-4aec6e16757d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:49:30 compute-0 podman[212931]: 2026-01-12 13:49:30.294240855 +0000 UTC m=+0.029818770 container create c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 12 13:49:30 compute-0 systemd[1]: Started libpod-conmon-c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5.scope.
Jan 12 13:49:30 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:49:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092f2ccf3ab0f55a5d94eef87aadd1645c70eebbfc6ffb0800d4156451333c0c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:49:30 compute-0 podman[212931]: 2026-01-12 13:49:30.358272286 +0000 UTC m=+0.093850190 container init c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 12 13:49:30 compute-0 podman[212931]: 2026-01-12 13:49:30.362956472 +0000 UTC m=+0.098534377 container start c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 12 13:49:30 compute-0 podman[212931]: 2026-01-12 13:49:30.280506464 +0000 UTC m=+0.016084389 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:49:30 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212943]: [NOTICE]   (212947) : New worker (212949) forked
Jan 12 13:49:30 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212943]: [NOTICE]   (212947) : Loading success.
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.401 181991 DEBUG nova.compute.manager [req-ef1bf3fb-b97a-4466-a0aa-4cfd5b4a4f84 req-290a7940-4bd2-4dfb-abfa-340705813877 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.402 181991 DEBUG oslo_concurrency.lockutils [req-ef1bf3fb-b97a-4466-a0aa-4cfd5b4a4f84 req-290a7940-4bd2-4dfb-abfa-340705813877 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.402 181991 DEBUG oslo_concurrency.lockutils [req-ef1bf3fb-b97a-4466-a0aa-4cfd5b4a4f84 req-290a7940-4bd2-4dfb-abfa-340705813877 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.402 181991 DEBUG oslo_concurrency.lockutils [req-ef1bf3fb-b97a-4466-a0aa-4cfd5b4a4f84 req-290a7940-4bd2-4dfb-abfa-340705813877 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.402 181991 DEBUG nova.compute.manager [req-ef1bf3fb-b97a-4466-a0aa-4cfd5b4a4f84 req-290a7940-4bd2-4dfb-abfa-340705813877 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Processing event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.777 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225770.7776697, 84bab5e9-2a3f-41cf-98f9-00af683fe4d1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.778 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] VM Started (Lifecycle Event)
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.780 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.785 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.788 181991 INFO nova.virt.libvirt.driver [-] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Instance spawned successfully.
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.788 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.802 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.806 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.809 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.810 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.810 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.811 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.811 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.811 181991 DEBUG nova.virt.libvirt.driver [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.828 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.828 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225770.777805, 84bab5e9-2a3f-41cf-98f9-00af683fe4d1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.829 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] VM Paused (Lifecycle Event)
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.846 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.848 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225770.7824407, 84bab5e9-2a3f-41cf-98f9-00af683fe4d1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.848 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] VM Resumed (Lifecycle Event)
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.854 181991 INFO nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Took 6.36 seconds to spawn the instance on the hypervisor.
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.854 181991 DEBUG nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.860 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.862 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.876 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.895 181991 INFO nova.compute.manager [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Took 6.62 seconds to build instance.
Jan 12 13:49:30 compute-0 nova_compute[181978]: 2026-01-12 13:49:30.911 181991 DEBUG oslo_concurrency.lockutils [None req-e9b11d82-21f8-453f-8fd3-d3f259f9f087 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:31 compute-0 nova_compute[181978]: 2026-01-12 13:49:31.288 181991 DEBUG nova.network.neutron [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Updated VIF entry in instance network info cache for port c5f2f44d-4e4c-448c-a052-e62a6d63a943. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:49:31 compute-0 nova_compute[181978]: 2026-01-12 13:49:31.289 181991 DEBUG nova.network.neutron [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Updating instance_info_cache with network_info: [{"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:31 compute-0 nova_compute[181978]: 2026-01-12 13:49:31.300 181991 DEBUG oslo_concurrency.lockutils [req-ab4a9141-cf65-4231-bf2f-074f1744a6e3 req-c7434583-c732-461f-b29d-853dbbf2a324 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-84bab5e9-2a3f-41cf-98f9-00af683fe4d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.233 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.487 181991 DEBUG nova.compute.manager [req-8b84c824-3225-49ad-ad67-6dd1b131d21a req-661eff8f-d13a-4e06-9251-6ae879e3e95c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.487 181991 DEBUG oslo_concurrency.lockutils [req-8b84c824-3225-49ad-ad67-6dd1b131d21a req-661eff8f-d13a-4e06-9251-6ae879e3e95c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.487 181991 DEBUG oslo_concurrency.lockutils [req-8b84c824-3225-49ad-ad67-6dd1b131d21a req-661eff8f-d13a-4e06-9251-6ae879e3e95c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.488 181991 DEBUG oslo_concurrency.lockutils [req-8b84c824-3225-49ad-ad67-6dd1b131d21a req-661eff8f-d13a-4e06-9251-6ae879e3e95c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.488 181991 DEBUG nova.compute.manager [req-8b84c824-3225-49ad-ad67-6dd1b131d21a req-661eff8f-d13a-4e06-9251-6ae879e3e95c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] No waiting events found dispatching network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.488 181991 WARNING nova.compute.manager [req-8b84c824-3225-49ad-ad67-6dd1b131d21a req-661eff8f-d13a-4e06-9251-6ae879e3e95c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received unexpected event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 for instance with vm_state active and task_state None.
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.509 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.510 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.510 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.510 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.510 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.511 181991 INFO nova.compute.manager [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Terminating instance
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.512 181991 DEBUG nova.compute.manager [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:49:32 compute-0 kernel: tapc5f2f44d-4e (unregistering): left promiscuous mode
Jan 12 13:49:32 compute-0 NetworkManager[55211]: <info>  [1768225772.5339] device (tapc5f2f44d-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:49:32 compute-0 ovn_controller[94974]: 2026-01-12T13:49:32Z|00135|binding|INFO|Releasing lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 from this chassis (sb_readonly=0)
Jan 12 13:49:32 compute-0 ovn_controller[94974]: 2026-01-12T13:49:32Z|00136|binding|INFO|Setting lport c5f2f44d-4e4c-448c-a052-e62a6d63a943 down in Southbound
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.540 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 ovn_controller[94974]: 2026-01-12T13:49:32Z|00137|binding|INFO|Removing iface tapc5f2f44d-4e ovn-installed in OVS
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.542 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.544 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:ae:f5 10.100.0.7'], port_security=['fa:16:3e:44:ae:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '84bab5e9-2a3f-41cf-98f9-00af683fe4d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-087938fb-9a2d-44f8-8567-4aec6e16757d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1450876522', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'e7848670-66d3-47c2-aa04-0080edfddbef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.199', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bdf92ca-2ac3-454f-8541-fbb26c1056cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=c5f2f44d-4e4c-448c-a052-e62a6d63a943) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.545 104189 INFO neutron.agent.ovn.metadata.agent [-] Port c5f2f44d-4e4c-448c-a052-e62a6d63a943 in datapath 087938fb-9a2d-44f8-8567-4aec6e16757d unbound from our chassis
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.546 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 087938fb-9a2d-44f8-8567-4aec6e16757d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.546 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[fd807d92-cbda-4822-bfdb-067bfdde675e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.547 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d namespace which is not needed anymore
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.554 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 12 13:49:32 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 2.678s CPU time.
Jan 12 13:49:32 compute-0 systemd-machined[153581]: Machine qemu-9-instance-00000009 terminated.
Jan 12 13:49:32 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212943]: [NOTICE]   (212947) : haproxy version is 2.8.14-c23fe91
Jan 12 13:49:32 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212943]: [NOTICE]   (212947) : path to executable is /usr/sbin/haproxy
Jan 12 13:49:32 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212943]: [WARNING]  (212947) : Exiting Master process...
Jan 12 13:49:32 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212943]: [ALERT]    (212947) : Current worker (212949) exited with code 143 (Terminated)
Jan 12 13:49:32 compute-0 neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d[212943]: [WARNING]  (212947) : All workers exited. Exiting... (0)
Jan 12 13:49:32 compute-0 systemd[1]: libpod-c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5.scope: Deactivated successfully.
Jan 12 13:49:32 compute-0 podman[212980]: 2026-01-12 13:49:32.642298557 +0000 UTC m=+0.033711409 container died c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 12 13:49:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5-userdata-shm.mount: Deactivated successfully.
Jan 12 13:49:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-092f2ccf3ab0f55a5d94eef87aadd1645c70eebbfc6ffb0800d4156451333c0c-merged.mount: Deactivated successfully.
Jan 12 13:49:32 compute-0 podman[212980]: 2026-01-12 13:49:32.664540552 +0000 UTC m=+0.055953403 container cleanup c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 12 13:49:32 compute-0 systemd[1]: libpod-conmon-c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5.scope: Deactivated successfully.
Jan 12 13:49:32 compute-0 podman[213006]: 2026-01-12 13:49:32.704397734 +0000 UTC m=+0.024279046 container remove c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.708 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[dacc4f57-d7d7-444f-9ebd-e67c5feab516]: (4, ('Mon Jan 12 01:49:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d (c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5)\nc922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5\nMon Jan 12 01:49:32 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d (c922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5)\nc922e74d4a6452e1a14e353b78e8fe5bd47fcf8f05f364b8cde469fdf862d6a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.709 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[cb165d3c-9961-4ada-82c0-7bf55bc1bad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.710 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap087938fb-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:32 compute-0 kernel: tap087938fb-90: left promiscuous mode
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.711 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 NetworkManager[55211]: <info>  [1768225772.7272] manager: (tapc5f2f44d-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.726 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.730 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[53a239bd-5062-4a45-acb2-c78e2d0816e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.738 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[0949cfee-d6cd-4b43-a657-6aa84d9bcf73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.738 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[be86c8f3-1fd1-47a3-b7fd-eb4778add342]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.752 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[844ff58d-3376-42e2-a09c-837243ed8a80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 283586, 'reachable_time': 30849, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213029, 'error': None, 'target': 'ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d087938fb\x2d9a2d\x2d44f8\x2d8567\x2d4aec6e16757d.mount: Deactivated successfully.
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.756 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-087938fb-9a2d-44f8-8567-4aec6e16757d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.757 181991 INFO nova.virt.libvirt.driver [-] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Instance destroyed successfully.
Jan 12 13:49:32 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:32.756 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[81fdc1d7-f85c-4e20-9e73-272832e3fe22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.757 181991 DEBUG nova.objects.instance [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 84bab5e9-2a3f-41cf-98f9-00af683fe4d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.772 181991 DEBUG nova.virt.libvirt.vif [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:49:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1228041536',display_name='tempest-TestNetworkBasicOps-server-1228041536',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1228041536',id=9,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLcH8SnX1ny+LljpsoBVLn3eoEDwdXwWpPae9qvujLiEK6kCFopKpzzNp2Jjuwn9M3iMy9BH+DLT+6s08X9xoW+n9d4cqAuat3ZOSFnzvy27En1wwVuwGz4jU+N1ZlOG4A==',key_name='tempest-TestNetworkBasicOps-1201816283',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:49:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-962v0oae',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:49:30Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=84bab5e9-2a3f-41cf-98f9-00af683fe4d1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.772 181991 DEBUG nova.network.os_vif_util [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "address": "fa:16:3e:44:ae:f5", "network": {"id": "087938fb-9a2d-44f8-8567-4aec6e16757d", "bridge": "br-int", "label": "tempest-network-smoke--42816525", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f2f44d-4e", "ovs_interfaceid": "c5f2f44d-4e4c-448c-a052-e62a6d63a943", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.773 181991 DEBUG nova.network.os_vif_util [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.773 181991 DEBUG os_vif [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.774 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.774 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5f2f44d-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.775 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.777 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.778 181991 INFO os_vif [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:ae:f5,bridge_name='br-int',has_traffic_filtering=True,id=c5f2f44d-4e4c-448c-a052-e62a6d63a943,network=Network(087938fb-9a2d-44f8-8567-4aec6e16757d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc5f2f44d-4e')
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.779 181991 INFO nova.virt.libvirt.driver [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Deleting instance files /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1_del
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.779 181991 INFO nova.virt.libvirt.driver [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Deletion of /var/lib/nova/instances/84bab5e9-2a3f-41cf-98f9-00af683fe4d1_del complete
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.813 181991 INFO nova.compute.manager [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.813 181991 DEBUG oslo.service.loopingcall [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.813 181991 DEBUG nova.compute.manager [-] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:49:32 compute-0 nova_compute[181978]: 2026-01-12 13:49:32.814 181991 DEBUG nova.network.neutron [-] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:49:33 compute-0 nova_compute[181978]: 2026-01-12 13:49:33.520 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225758.5184014, 1bf751c2-8932-4459-8100-733329db21d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:33 compute-0 nova_compute[181978]: 2026-01-12 13:49:33.521 181991 INFO nova.compute.manager [-] [instance: 1bf751c2-8932-4459-8100-733329db21d2] VM Stopped (Lifecycle Event)
Jan 12 13:49:33 compute-0 nova_compute[181978]: 2026-01-12 13:49:33.538 181991 DEBUG nova.compute.manager [None req-c324434c-4268-492f-8d17-3ca3db0f95cb - - - - - -] [instance: 1bf751c2-8932-4459-8100-733329db21d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.347 181991 DEBUG nova.network.neutron [-] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.362 181991 INFO nova.compute.manager [-] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Took 1.55 seconds to deallocate network for instance.
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.397 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.397 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.441 181991 DEBUG nova.compute.provider_tree [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.451 181991 DEBUG nova.scheduler.client.report [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.465 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.482 181991 INFO nova.scheduler.client.report [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 84bab5e9-2a3f-41cf-98f9-00af683fe4d1
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.531 181991 DEBUG oslo_concurrency.lockutils [None req-004097ce-0c4f-4e84-8b2f-d1f4b1df5e1b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.542 181991 DEBUG nova.compute.manager [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received event network-vif-unplugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.542 181991 DEBUG oslo_concurrency.lockutils [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.542 181991 DEBUG oslo_concurrency.lockutils [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.542 181991 DEBUG oslo_concurrency.lockutils [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.543 181991 DEBUG nova.compute.manager [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] No waiting events found dispatching network-vif-unplugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.543 181991 WARNING nova.compute.manager [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received unexpected event network-vif-unplugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 for instance with vm_state deleted and task_state None.
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.543 181991 DEBUG nova.compute.manager [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.543 181991 DEBUG oslo_concurrency.lockutils [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.543 181991 DEBUG oslo_concurrency.lockutils [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.543 181991 DEBUG oslo_concurrency.lockutils [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "84bab5e9-2a3f-41cf-98f9-00af683fe4d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.544 181991 DEBUG nova.compute.manager [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] No waiting events found dispatching network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:49:34 compute-0 nova_compute[181978]: 2026-01-12 13:49:34.544 181991 WARNING nova.compute.manager [req-dafd8d69-ffe0-4592-ad35-955f12a474b5 req-038c5fc1-e4b9-4d1d-8937-6724f351b145 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Received unexpected event network-vif-plugged-c5f2f44d-4e4c-448c-a052-e62a6d63a943 for instance with vm_state deleted and task_state None.
Jan 12 13:49:35 compute-0 podman[213036]: 2026-01-12 13:49:35.544979335 +0000 UTC m=+0.038968933 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 12 13:49:37 compute-0 nova_compute[181978]: 2026-01-12 13:49:37.234 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:37 compute-0 nova_compute[181978]: 2026-01-12 13:49:37.775 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:38 compute-0 podman[213058]: 2026-01-12 13:49:38.573367741 +0000 UTC m=+0.067646556 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:49:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:40.202 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:40.203 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:40.203 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:41 compute-0 nova_compute[181978]: 2026-01-12 13:49:41.548 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:41 compute-0 nova_compute[181978]: 2026-01-12 13:49:41.622 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:42 compute-0 nova_compute[181978]: 2026-01-12 13:49:42.236 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:42 compute-0 nova_compute[181978]: 2026-01-12 13:49:42.776 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:47 compute-0 nova_compute[181978]: 2026-01-12 13:49:47.238 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:47 compute-0 nova_compute[181978]: 2026-01-12 13:49:47.753 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225772.7536528, 84bab5e9-2a3f-41cf-98f9-00af683fe4d1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:47 compute-0 nova_compute[181978]: 2026-01-12 13:49:47.754 181991 INFO nova.compute.manager [-] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] VM Stopped (Lifecycle Event)
Jan 12 13:49:47 compute-0 nova_compute[181978]: 2026-01-12 13:49:47.770 181991 DEBUG nova.compute.manager [None req-864088f7-0d41-4187-9a75-87944c936646 - - - - - -] [instance: 84bab5e9-2a3f-41cf-98f9-00af683fe4d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:47 compute-0 nova_compute[181978]: 2026-01-12 13:49:47.777 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:48 compute-0 podman[213078]: 2026-01-12 13:49:48.553529165 +0000 UTC m=+0.044345830 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, release=1755695350, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 12 13:49:48 compute-0 podman[213076]: 2026-01-12 13:49:48.559979131 +0000 UTC m=+0.054819822 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:49:48 compute-0 podman[213077]: 2026-01-12 13:49:48.577393213 +0000 UTC m=+0.070560829 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.506 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.506 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.533 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.589 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.589 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.595 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.595 181991 INFO nova.compute.claims [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.680 181991 DEBUG nova.compute.provider_tree [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.690 181991 DEBUG nova.scheduler.client.report [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.706 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.706 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.744 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.744 181991 DEBUG nova.network.neutron [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.760 181991 INFO nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.773 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.834 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.834 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.835 181991 INFO nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Creating image(s)
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.835 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.835 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.836 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.846 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.865 181991 DEBUG nova.policy [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.889 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.889 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.890 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.899 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.941 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.942 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.959 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.960 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:50 compute-0 nova_compute[181978]: 2026-01-12 13:49:50.960 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.002 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.002 181991 DEBUG nova.virt.disk.api [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.003 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.046 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.046 181991 DEBUG nova.virt.disk.api [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.047 181991 DEBUG nova.objects.instance [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 6e6567cb-2108-4fae-9bf9-0626331454b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.058 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.059 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Ensure instance console log exists: /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.059 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.059 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.060 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:51 compute-0 nova_compute[181978]: 2026-01-12 13:49:51.563 181991 DEBUG nova.network.neutron [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Successfully created port: 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.035 181991 DEBUG nova.network.neutron [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Successfully updated port: 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.052 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.053 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.053 181991 DEBUG nova.network.neutron [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.120 181991 DEBUG nova.compute.manager [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-changed-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.121 181991 DEBUG nova.compute.manager [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Refreshing instance network info cache due to event network-changed-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.121 181991 DEBUG oslo_concurrency.lockutils [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.239 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.276 181991 DEBUG nova.network.neutron [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.778 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.816 181991 DEBUG nova.network.neutron [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updating instance_info_cache with network_info: [{"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.831 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.831 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Instance network_info: |[{"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.832 181991 DEBUG oslo_concurrency.lockutils [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.832 181991 DEBUG nova.network.neutron [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Refreshing network info cache for port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.834 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Start _get_guest_xml network_info=[{"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.836 181991 WARNING nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.843 181991 DEBUG nova.virt.libvirt.host [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.843 181991 DEBUG nova.virt.libvirt.host [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.846 181991 DEBUG nova.virt.libvirt.host [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.846 181991 DEBUG nova.virt.libvirt.host [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.846 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.846 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.847 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.847 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.847 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.847 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.847 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.847 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.847 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.848 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.848 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.848 181991 DEBUG nova.virt.hardware [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.850 181991 DEBUG nova.virt.libvirt.vif [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:49:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1322462850',display_name='tempest-TestNetworkBasicOps-server-1322462850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1322462850',id=10,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLoerwcayeO/E/Bo2LGcwnlnYesIpAIQ4jTedkwknPNylnVhyIdeopuprjbjB9Bb7PZAxWqSWQdSpXN6bwm1wz39bW5nBT7oPVdPhM15v8/+bpkH4qHppXe0vLeKCgRbkg==',key_name='tempest-TestNetworkBasicOps-1687262822',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-jkyl5ryb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:49:50Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=6e6567cb-2108-4fae-9bf9-0626331454b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.851 181991 DEBUG nova.network.os_vif_util [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.851 181991 DEBUG nova.network.os_vif_util [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:93:af,bridge_name='br-int',has_traffic_filtering=True,id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7,network=Network(096f8666-1516-4762-82d5-078632578ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3908623a-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.852 181991 DEBUG nova.objects.instance [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e6567cb-2108-4fae-9bf9-0626331454b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.862 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <uuid>6e6567cb-2108-4fae-9bf9-0626331454b8</uuid>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <name>instance-0000000a</name>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-1322462850</nova:name>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:49:52</nova:creationTime>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         <nova:port uuid="3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7">
Jan 12 13:49:52 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <system>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <entry name="serial">6e6567cb-2108-4fae-9bf9-0626331454b8</entry>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <entry name="uuid">6e6567cb-2108-4fae-9bf9-0626331454b8</entry>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </system>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <os>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   </os>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <features>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   </features>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk.config"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:47:93:af"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <target dev="tap3908623a-3c"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/console.log" append="off"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <video>
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </video>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:49:52 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:49:52 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:49:52 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:49:52 compute-0 nova_compute[181978]: </domain>
Jan 12 13:49:52 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.862 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Preparing to wait for external event network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.862 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.862 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.863 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.863 181991 DEBUG nova.virt.libvirt.vif [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:49:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1322462850',display_name='tempest-TestNetworkBasicOps-server-1322462850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1322462850',id=10,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLoerwcayeO/E/Bo2LGcwnlnYesIpAIQ4jTedkwknPNylnVhyIdeopuprjbjB9Bb7PZAxWqSWQdSpXN6bwm1wz39bW5nBT7oPVdPhM15v8/+bpkH4qHppXe0vLeKCgRbkg==',key_name='tempest-TestNetworkBasicOps-1687262822',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-jkyl5ryb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:49:50Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=6e6567cb-2108-4fae-9bf9-0626331454b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.863 181991 DEBUG nova.network.os_vif_util [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.864 181991 DEBUG nova.network.os_vif_util [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:93:af,bridge_name='br-int',has_traffic_filtering=True,id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7,network=Network(096f8666-1516-4762-82d5-078632578ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3908623a-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.864 181991 DEBUG os_vif [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:93:af,bridge_name='br-int',has_traffic_filtering=True,id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7,network=Network(096f8666-1516-4762-82d5-078632578ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3908623a-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.864 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.864 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.865 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.866 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.866 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3908623a-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.867 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3908623a-3c, col_values=(('external_ids', {'iface-id': '3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:93:af', 'vm-uuid': '6e6567cb-2108-4fae-9bf9-0626331454b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.868 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:52 compute-0 NetworkManager[55211]: <info>  [1768225792.8688] manager: (tap3908623a-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.870 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.872 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.872 181991 INFO os_vif [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:93:af,bridge_name='br-int',has_traffic_filtering=True,id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7,network=Network(096f8666-1516-4762-82d5-078632578ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3908623a-3c')
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.904 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.904 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.905 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:47:93:af, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:49:52 compute-0 nova_compute[181978]: 2026-01-12 13:49:52.905 181991 INFO nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Using config drive
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.121 181991 INFO nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Creating config drive at /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk.config
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.125 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpktj7g_a1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.240 181991 DEBUG oslo_concurrency.processutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpktj7g_a1" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:49:53 compute-0 kernel: tap3908623a-3c: entered promiscuous mode
Jan 12 13:49:53 compute-0 NetworkManager[55211]: <info>  [1768225793.2693] manager: (tap3908623a-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Jan 12 13:49:53 compute-0 ovn_controller[94974]: 2026-01-12T13:49:53Z|00138|binding|INFO|Claiming lport 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 for this chassis.
Jan 12 13:49:53 compute-0 ovn_controller[94974]: 2026-01-12T13:49:53Z|00139|binding|INFO|3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7: Claiming fa:16:3e:47:93:af 10.100.0.4
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.273 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.287 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:93:af 10.100.0.4'], port_security=['fa:16:3e:47:93:af 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6e6567cb-2108-4fae-9bf9-0626331454b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-096f8666-1516-4762-82d5-078632578ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b2d7871-c305-4fe8-9a2c-587d838778c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b34c6b7-161f-4365-a754-5cc4edcc3dcb, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.288 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 in datapath 096f8666-1516-4762-82d5-078632578ca8 bound to our chassis
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.289 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 096f8666-1516-4762-82d5-078632578ca8
Jan 12 13:49:53 compute-0 systemd-udevd[213173]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.296 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3e00ba93-f7fa-4d97-9780-1ee1f31d1844]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.297 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap096f8666-11 in ovnmeta-096f8666-1516-4762-82d5-078632578ca8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.298 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap096f8666-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.298 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[edaf362a-c074-4c25-97b5-22fa81311b90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.298 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[86b7c387-7eeb-497d-b18f-83d27a688a8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 NetworkManager[55211]: <info>  [1768225793.3037] device (tap3908623a-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:49:53 compute-0 NetworkManager[55211]: <info>  [1768225793.3041] device (tap3908623a-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.307 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[04893c4d-478d-4750-8584-f33cd08d55c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 systemd-machined[153581]: New machine qemu-10-instance-0000000a.
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.329 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[22e51973-421b-48b3-bf4c-a449902b3c0f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.330 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:53 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.334 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:53 compute-0 ovn_controller[94974]: 2026-01-12T13:49:53Z|00140|binding|INFO|Setting lport 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 ovn-installed in OVS
Jan 12 13:49:53 compute-0 ovn_controller[94974]: 2026-01-12T13:49:53Z|00141|binding|INFO|Setting lport 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 up in Southbound
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.338 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.348 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[d157cbc9-050a-4b06-ad15-4dc4fd989689]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.352 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d487dac5-c4ff-478b-84dc-f24108dbb9a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 NetworkManager[55211]: <info>  [1768225793.3530] manager: (tap096f8666-10): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Jan 12 13:49:53 compute-0 systemd-udevd[213177]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.375 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[b230bd1c-897d-4413-978f-b3ac43636416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.377 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[5c88dd01-e244-42bf-a9e0-f816047e23f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 NetworkManager[55211]: <info>  [1768225793.3922] device (tap096f8666-10): carrier: link connected
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.395 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[5a79ece2-d26d-4bc6-8b96-294437039c2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.407 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[f131b490-642f-4096-8e3a-ee1185bb72ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap096f8666-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:03:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 285941, 'reachable_time': 32545, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213200, 'error': None, 'target': 'ovnmeta-096f8666-1516-4762-82d5-078632578ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.416 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5bfd41-e957-4b7a-b562-0dc0d119245c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:32a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 285941, 'tstamp': 285941}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213201, 'error': None, 'target': 'ovnmeta-096f8666-1516-4762-82d5-078632578ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.425 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3f48a328-cdd7-4d6a-a1d9-201dae0e08a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap096f8666-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:03:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 285941, 'reachable_time': 32545, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213202, 'error': None, 'target': 'ovnmeta-096f8666-1516-4762-82d5-078632578ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.442 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1c07f231-920b-4aaa-8092-0f248cf8977e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.472 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[fd42955d-45a4-450d-a4b4-2cc8e90edc25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.473 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap096f8666-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.473 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.473 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap096f8666-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:53 compute-0 kernel: tap096f8666-10: entered promiscuous mode
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.475 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:53 compute-0 NetworkManager[55211]: <info>  [1768225793.4759] manager: (tap096f8666-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.479 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap096f8666-10, col_values=(('external_ids', {'iface-id': '6caf6c20-c6d9-4587-b015-9d26982c15d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.479 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:53 compute-0 ovn_controller[94974]: 2026-01-12T13:49:53Z|00142|binding|INFO|Releasing lport 6caf6c20-c6d9-4587-b015-9d26982c15d5 from this chassis (sb_readonly=0)
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.480 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/096f8666-1516-4762-82d5-078632578ca8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/096f8666-1516-4762-82d5-078632578ca8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.481 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[abeb1928-32e5-4f77-8c9f-0a5a6a147678]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.481 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-096f8666-1516-4762-82d5-078632578ca8
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/096f8666-1516-4762-82d5-078632578ca8.pid.haproxy
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID 096f8666-1516-4762-82d5-078632578ca8
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:49:53 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:49:53.483 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-096f8666-1516-4762-82d5-078632578ca8', 'env', 'PROCESS_TAG=haproxy-096f8666-1516-4762-82d5-078632578ca8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/096f8666-1516-4762-82d5-078632578ca8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.492 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.575 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225793.5745347, 6e6567cb-2108-4fae-9bf9-0626331454b8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.575 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] VM Started (Lifecycle Event)
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.592 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.594 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225793.5746164, 6e6567cb-2108-4fae-9bf9-0626331454b8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.594 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] VM Paused (Lifecycle Event)
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.605 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.607 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.619 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:49:53 compute-0 podman[213237]: 2026-01-12 13:49:53.758904148 +0000 UTC m=+0.028883187 container create cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.776 181991 DEBUG nova.network.neutron [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updated VIF entry in instance network info cache for port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.776 181991 DEBUG nova.network.neutron [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updating instance_info_cache with network_info: [{"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:49:53 compute-0 systemd[1]: Started libpod-conmon-cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446.scope.
Jan 12 13:49:53 compute-0 nova_compute[181978]: 2026-01-12 13:49:53.787 181991 DEBUG oslo_concurrency.lockutils [req-e6b2a0f9-240c-4ccc-8882-6b46ba4ffea4 req-105e0c13-6bef-48b4-8fd4-67d71b6bd617 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:49:53 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:49:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9633b40c0da97379af7efd71e0c3715083ae7683428f575a6ea1b225aedf4a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:49:53 compute-0 podman[213237]: 2026-01-12 13:49:53.816095958 +0000 UTC m=+0.086074998 container init cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 12 13:49:53 compute-0 podman[213237]: 2026-01-12 13:49:53.820159979 +0000 UTC m=+0.090139019 container start cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 12 13:49:53 compute-0 podman[213237]: 2026-01-12 13:49:53.745549338 +0000 UTC m=+0.015528397 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:49:53 compute-0 neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8[213249]: [NOTICE]   (213253) : New worker (213255) forked
Jan 12 13:49:53 compute-0 neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8[213249]: [NOTICE]   (213253) : Loading success.
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.181 181991 DEBUG nova.compute.manager [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.182 181991 DEBUG oslo_concurrency.lockutils [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.183 181991 DEBUG oslo_concurrency.lockutils [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.184 181991 DEBUG oslo_concurrency.lockutils [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.185 181991 DEBUG nova.compute.manager [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Processing event network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.186 181991 DEBUG nova.compute.manager [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.186 181991 DEBUG oslo_concurrency.lockutils [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.187 181991 DEBUG oslo_concurrency.lockutils [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.187 181991 DEBUG oslo_concurrency.lockutils [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.188 181991 DEBUG nova.compute.manager [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] No waiting events found dispatching network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.189 181991 WARNING nova.compute.manager [req-93d3b370-5fa5-4631-8d38-77e2023db3c0 req-800c2b7b-047c-4694-98f0-e9f847b58649 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received unexpected event network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 for instance with vm_state building and task_state spawning.
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.192 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.199 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225794.1997383, 6e6567cb-2108-4fae-9bf9-0626331454b8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.200 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] VM Resumed (Lifecycle Event)
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.201 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.205 181991 INFO nova.virt.libvirt.driver [-] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Instance spawned successfully.
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.206 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.216 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.220 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.223 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.223 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.224 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.224 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.224 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.225 181991 DEBUG nova.virt.libvirt.driver [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.240 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.261 181991 INFO nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Took 3.43 seconds to spawn the instance on the hypervisor.
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.261 181991 DEBUG nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.298 181991 INFO nova.compute.manager [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Took 3.73 seconds to build instance.
Jan 12 13:49:54 compute-0 nova_compute[181978]: 2026-01-12 13:49:54.308 181991 DEBUG oslo_concurrency.lockutils [None req-7a6d331c-f5eb-425c-a396-7cb798b5437f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:49:55 compute-0 podman[213260]: 2026-01-12 13:49:55.551227818 +0000 UTC m=+0.038438391 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 12 13:49:57 compute-0 nova_compute[181978]: 2026-01-12 13:49:57.240 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:57 compute-0 nova_compute[181978]: 2026-01-12 13:49:57.869 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:59 compute-0 ovn_controller[94974]: 2026-01-12T13:49:59Z|00143|binding|INFO|Releasing lport 6caf6c20-c6d9-4587-b015-9d26982c15d5 from this chassis (sb_readonly=0)
Jan 12 13:49:59 compute-0 NetworkManager[55211]: <info>  [1768225799.4384] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 12 13:49:59 compute-0 NetworkManager[55211]: <info>  [1768225799.4391] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.457 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.471 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:59 compute-0 ovn_controller[94974]: 2026-01-12T13:49:59Z|00144|binding|INFO|Releasing lport 6caf6c20-c6d9-4587-b015-9d26982c15d5 from this chassis (sb_readonly=0)
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.474 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.659 181991 DEBUG nova.compute.manager [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-changed-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.659 181991 DEBUG nova.compute.manager [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Refreshing instance network info cache due to event network-changed-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.659 181991 DEBUG oslo_concurrency.lockutils [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.660 181991 DEBUG oslo_concurrency.lockutils [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:49:59 compute-0 nova_compute[181978]: 2026-01-12 13:49:59.660 181991 DEBUG nova.network.neutron [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Refreshing network info cache for port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:50:01 compute-0 nova_compute[181978]: 2026-01-12 13:50:01.498 181991 DEBUG nova.network.neutron [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updated VIF entry in instance network info cache for port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:50:01 compute-0 nova_compute[181978]: 2026-01-12 13:50:01.499 181991 DEBUG nova.network.neutron [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updating instance_info_cache with network_info: [{"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:01 compute-0 nova_compute[181978]: 2026-01-12 13:50:01.512 181991 DEBUG oslo_concurrency.lockutils [req-e49be6c3-03c5-49b4-8bc2-d0cbd9ffc9fb req-3472a9f6-0aa5-4845-996b-8b0127498f2d 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:02 compute-0 nova_compute[181978]: 2026-01-12 13:50:02.241 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:02 compute-0 nova_compute[181978]: 2026-01-12 13:50:02.871 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:05 compute-0 ovn_controller[94974]: 2026-01-12T13:50:05Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:93:af 10.100.0.4
Jan 12 13:50:05 compute-0 ovn_controller[94974]: 2026-01-12T13:50:05Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:93:af 10.100.0.4
Jan 12 13:50:06 compute-0 podman[213291]: 2026-01-12 13:50:06.542373614 +0000 UTC m=+0.037280114 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 12 13:50:07 compute-0 nova_compute[181978]: 2026-01-12 13:50:07.243 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:07 compute-0 nova_compute[181978]: 2026-01-12 13:50:07.873 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:09 compute-0 podman[213312]: 2026-01-12 13:50:09.544088168 +0000 UTC m=+0.038093513 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:50:11 compute-0 nova_compute[181978]: 2026-01-12 13:50:11.973 181991 INFO nova.compute.manager [None req-9e2edfe1-2f90-4aab-9b42-62249016784b d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Get console output
Jan 12 13:50:11 compute-0 nova_compute[181978]: 2026-01-12 13:50:11.976 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:50:12 compute-0 nova_compute[181978]: 2026-01-12 13:50:12.246 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:12 compute-0 nova_compute[181978]: 2026-01-12 13:50:12.874 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:13 compute-0 nova_compute[181978]: 2026-01-12 13:50:13.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:13 compute-0 ovn_controller[94974]: 2026-01-12T13:50:13Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:93:af 10.100.0.4
Jan 12 13:50:16 compute-0 ovn_controller[94974]: 2026-01-12T13:50:16Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:93:af 10.100.0.4
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.493 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.494 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.511 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.706 181991 DEBUG nova.compute.manager [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-changed-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.707 181991 DEBUG nova.compute.manager [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Refreshing instance network info cache due to event network-changed-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.707 181991 DEBUG oslo_concurrency.lockutils [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.708 181991 DEBUG oslo_concurrency.lockutils [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.708 181991 DEBUG nova.network.neutron [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Refreshing network info cache for port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.749 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.749 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.749 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.750 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.750 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.751 181991 INFO nova.compute.manager [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Terminating instance
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.752 181991 DEBUG nova.compute.manager [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:50:16 compute-0 kernel: tap3908623a-3c (unregistering): left promiscuous mode
Jan 12 13:50:16 compute-0 NetworkManager[55211]: <info>  [1768225816.7781] device (tap3908623a-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:50:16 compute-0 ovn_controller[94974]: 2026-01-12T13:50:16Z|00145|binding|INFO|Releasing lport 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 from this chassis (sb_readonly=0)
Jan 12 13:50:16 compute-0 ovn_controller[94974]: 2026-01-12T13:50:16Z|00146|binding|INFO|Setting lport 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 down in Southbound
Jan 12 13:50:16 compute-0 ovn_controller[94974]: 2026-01-12T13:50:16Z|00147|binding|INFO|Removing iface tap3908623a-3c ovn-installed in OVS
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.783 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.787 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:93:af 10.100.0.4'], port_security=['fa:16:3e:47:93:af 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6e6567cb-2108-4fae-9bf9-0626331454b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-096f8666-1516-4762-82d5-078632578ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b2d7871-c305-4fe8-9a2c-587d838778c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b34c6b7-161f-4365-a754-5cc4edcc3dcb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.788 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 in datapath 096f8666-1516-4762-82d5-078632578ca8 unbound from our chassis
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.789 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 096f8666-1516-4762-82d5-078632578ca8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.790 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[848f4ef5-1923-4117-823b-92618817a634]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.790 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-096f8666-1516-4762-82d5-078632578ca8 namespace which is not needed anymore
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.798 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:16 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 12 13:50:16 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 10.491s CPU time.
Jan 12 13:50:16 compute-0 systemd-machined[153581]: Machine qemu-10-instance-0000000a terminated.
Jan 12 13:50:16 compute-0 neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8[213249]: [NOTICE]   (213253) : haproxy version is 2.8.14-c23fe91
Jan 12 13:50:16 compute-0 neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8[213249]: [NOTICE]   (213253) : path to executable is /usr/sbin/haproxy
Jan 12 13:50:16 compute-0 neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8[213249]: [ALERT]    (213253) : Current worker (213255) exited with code 143 (Terminated)
Jan 12 13:50:16 compute-0 neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8[213249]: [WARNING]  (213253) : All workers exited. Exiting... (0)
Jan 12 13:50:16 compute-0 systemd[1]: libpod-cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446.scope: Deactivated successfully.
Jan 12 13:50:16 compute-0 podman[213351]: 2026-01-12 13:50:16.888891974 +0000 UTC m=+0.036494657 container died cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 12 13:50:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446-userdata-shm.mount: Deactivated successfully.
Jan 12 13:50:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-a9633b40c0da97379af7efd71e0c3715083ae7683428f575a6ea1b225aedf4a3-merged.mount: Deactivated successfully.
Jan 12 13:50:16 compute-0 podman[213351]: 2026-01-12 13:50:16.911803832 +0000 UTC m=+0.059406505 container cleanup cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 12 13:50:16 compute-0 systemd[1]: libpod-conmon-cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446.scope: Deactivated successfully.
Jan 12 13:50:16 compute-0 podman[213375]: 2026-01-12 13:50:16.950374403 +0000 UTC m=+0.023272578 container remove cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.953 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[06a9c2c2-2e90-425e-9800-d011d074dd65]: (4, ('Mon Jan 12 01:50:16 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8 (cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446)\ncf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446\nMon Jan 12 01:50:16 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-096f8666-1516-4762-82d5-078632578ca8 (cf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446)\ncf8368bb9668ef7eabd5111be38e839613ec42f67c2aaebfcee34398dbd11446\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.955 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[2f862f14-5c2a-429d-9988-7461d511bf49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.955 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap096f8666-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:16 compute-0 kernel: tap096f8666-10: left promiscuous mode
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.959 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:16 compute-0 NetworkManager[55211]: <info>  [1768225816.9728] manager: (tap3908623a-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.973 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.974 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.977 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1e76d01a-85c1-4a1f-a891-822c91ffe0b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.986 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b67fb5-1305-43c2-9cba-dab84380f68a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.986 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[23a9a010-b698-46d2-9408-13998de5d5a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.995 181991 DEBUG nova.compute.manager [req-c5841c44-6f8f-4802-8a78-ed8406192893 req-d5336361-d891-48ce-b526-8a2ccd4551ce 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-vif-unplugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.995 181991 DEBUG oslo_concurrency.lockutils [req-c5841c44-6f8f-4802-8a78-ed8406192893 req-d5336361-d891-48ce-b526-8a2ccd4551ce 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.996 181991 DEBUG oslo_concurrency.lockutils [req-c5841c44-6f8f-4802-8a78-ed8406192893 req-d5336361-d891-48ce-b526-8a2ccd4551ce 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.996 181991 DEBUG oslo_concurrency.lockutils [req-c5841c44-6f8f-4802-8a78-ed8406192893 req-d5336361-d891-48ce-b526-8a2ccd4551ce 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.996 181991 DEBUG nova.compute.manager [req-c5841c44-6f8f-4802-8a78-ed8406192893 req-d5336361-d891-48ce-b526-8a2ccd4551ce 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] No waiting events found dispatching network-vif-unplugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:50:16 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.996 181991 DEBUG nova.compute.manager [req-c5841c44-6f8f-4802-8a78-ed8406192893 req-d5336361-d891-48ce-b526-8a2ccd4551ce 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-vif-unplugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:16.999 181991 INFO nova.virt.libvirt.driver [-] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Instance destroyed successfully.
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.000 181991 DEBUG nova.objects.instance [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 6e6567cb-2108-4fae-9bf9-0626331454b8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:50:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:16.999 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b9462aa7-8b70-4e9a-a8cc-7bf790a2a80b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 285937, 'reachable_time': 32924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213401, 'error': None, 'target': 'ovnmeta-096f8666-1516-4762-82d5-078632578ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:17.001 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-096f8666-1516-4762-82d5-078632578ca8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:50:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:17.001 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[e7392b8b-27aa-43fd-85f1-d51af15e075c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d096f8666\x2d1516\x2d4762\x2d82d5\x2d078632578ca8.mount: Deactivated successfully.
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.008 181991 DEBUG nova.virt.libvirt.vif [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:49:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1322462850',display_name='tempest-TestNetworkBasicOps-server-1322462850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1322462850',id=10,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLoerwcayeO/E/Bo2LGcwnlnYesIpAIQ4jTedkwknPNylnVhyIdeopuprjbjB9Bb7PZAxWqSWQdSpXN6bwm1wz39bW5nBT7oPVdPhM15v8/+bpkH4qHppXe0vLeKCgRbkg==',key_name='tempest-TestNetworkBasicOps-1687262822',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:49:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-jkyl5ryb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:49:54Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=6e6567cb-2108-4fae-9bf9-0626331454b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.009 181991 DEBUG nova.network.os_vif_util [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.009 181991 DEBUG nova.network.os_vif_util [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:93:af,bridge_name='br-int',has_traffic_filtering=True,id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7,network=Network(096f8666-1516-4762-82d5-078632578ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3908623a-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.010 181991 DEBUG os_vif [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:93:af,bridge_name='br-int',has_traffic_filtering=True,id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7,network=Network(096f8666-1516-4762-82d5-078632578ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3908623a-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.011 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.011 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3908623a-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.014 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.016 181991 INFO os_vif [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:93:af,bridge_name='br-int',has_traffic_filtering=True,id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7,network=Network(096f8666-1516-4762-82d5-078632578ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3908623a-3c')
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.016 181991 INFO nova.virt.libvirt.driver [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Deleting instance files /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8_del
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.016 181991 INFO nova.virt.libvirt.driver [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Deletion of /var/lib/nova/instances/6e6567cb-2108-4fae-9bf9-0626331454b8_del complete
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.053 181991 INFO nova.compute.manager [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.053 181991 DEBUG oslo.service.loopingcall [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.053 181991 DEBUG nova.compute.manager [-] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.053 181991 DEBUG nova.network.neutron [-] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.248 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.416 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:17.416 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:50:17 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:17.417 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.497 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.498 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.498 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.514 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.514 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.772 181991 DEBUG nova.network.neutron [-] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.783 181991 INFO nova.compute.manager [-] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Took 0.73 seconds to deallocate network for instance.
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.814 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:17 compute-0 nova_compute[181978]: 2026-01-12 13:50:17.814 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.009 181991 DEBUG nova.network.neutron [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updated VIF entry in instance network info cache for port 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.009 181991 DEBUG nova.network.neutron [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Updating instance_info_cache with network_info: [{"id": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "address": "fa:16:3e:47:93:af", "network": {"id": "096f8666-1516-4762-82d5-078632578ca8", "bridge": "br-int", "label": "tempest-network-smoke--1047382062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3908623a-3c", "ovs_interfaceid": "3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.012 181991 DEBUG nova.compute.provider_tree [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.024 181991 DEBUG nova.scheduler.client.report [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.027 181991 DEBUG oslo_concurrency.lockutils [req-6dfbe515-0718-4afd-9473-ffb28b2d3c9d req-4d74090a-4125-46ea-8216-90317e77f4ac 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-6e6567cb-2108-4fae-9bf9-0626331454b8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.036 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.055 181991 INFO nova.scheduler.client.report [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 6e6567cb-2108-4fae-9bf9-0626331454b8
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.094 181991 DEBUG oslo_concurrency.lockutils [None req-6258ab15-b6a3-4e52-a005-c0ad8b22932d d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.784 181991 DEBUG nova.compute.manager [req-fbba96d2-48ae-4574-aa82-e0eae781b0f6 req-a57cf402-85d2-485b-af27-9f2107999af3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-vif-deleted-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.785 181991 INFO nova.compute.manager [req-fbba96d2-48ae-4574-aa82-e0eae781b0f6 req-a57cf402-85d2-485b-af27-9f2107999af3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Neutron deleted interface 3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7; detaching it from the instance and deleting it from the info cache
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.785 181991 DEBUG nova.network.neutron [req-fbba96d2-48ae-4574-aa82-e0eae781b0f6 req-a57cf402-85d2-485b-af27-9f2107999af3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 12 13:50:18 compute-0 nova_compute[181978]: 2026-01-12 13:50:18.786 181991 DEBUG nova.compute.manager [req-fbba96d2-48ae-4574-aa82-e0eae781b0f6 req-a57cf402-85d2-485b-af27-9f2107999af3 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Detach interface failed, port_id=3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7, reason: Instance 6e6567cb-2108-4fae-9bf9-0626331454b8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 12 13:50:19 compute-0 nova_compute[181978]: 2026-01-12 13:50:19.061 181991 DEBUG nova.compute.manager [req-e9c11720-7569-42db-a837-13acce5aca2b req-cb3a5ebc-838e-4b1b-8491-a0befdc52f43 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received event network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:19 compute-0 nova_compute[181978]: 2026-01-12 13:50:19.061 181991 DEBUG oslo_concurrency.lockutils [req-e9c11720-7569-42db-a837-13acce5aca2b req-cb3a5ebc-838e-4b1b-8491-a0befdc52f43 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:19 compute-0 nova_compute[181978]: 2026-01-12 13:50:19.062 181991 DEBUG oslo_concurrency.lockutils [req-e9c11720-7569-42db-a837-13acce5aca2b req-cb3a5ebc-838e-4b1b-8491-a0befdc52f43 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:19 compute-0 nova_compute[181978]: 2026-01-12 13:50:19.062 181991 DEBUG oslo_concurrency.lockutils [req-e9c11720-7569-42db-a837-13acce5aca2b req-cb3a5ebc-838e-4b1b-8491-a0befdc52f43 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "6e6567cb-2108-4fae-9bf9-0626331454b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:19 compute-0 nova_compute[181978]: 2026-01-12 13:50:19.062 181991 DEBUG nova.compute.manager [req-e9c11720-7569-42db-a837-13acce5aca2b req-cb3a5ebc-838e-4b1b-8491-a0befdc52f43 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] No waiting events found dispatching network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:50:19 compute-0 nova_compute[181978]: 2026-01-12 13:50:19.062 181991 WARNING nova.compute.manager [req-e9c11720-7569-42db-a837-13acce5aca2b req-cb3a5ebc-838e-4b1b-8491-a0befdc52f43 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Received unexpected event network-vif-plugged-3908623a-3c0c-4acf-9b3d-8da5a3a0b0d7 for instance with vm_state deleted and task_state None.
Jan 12 13:50:19 compute-0 podman[213406]: 2026-01-12 13:50:19.56120829 +0000 UTC m=+0.053264781 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:50:19 compute-0 podman[213407]: 2026-01-12 13:50:19.565729511 +0000 UTC m=+0.055612625 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, 
com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350)
Jan 12 13:50:19 compute-0 podman[213405]: 2026-01-12 13:50:19.594393694 +0000 UTC m=+0.086086301 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller)
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.490 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.491 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.491 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.510 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.511 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.511 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.511 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.737 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.738 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.38033676147461GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.738 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.738 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.776 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.777 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.792 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.799 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.811 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:50:21 compute-0 nova_compute[181978]: 2026-01-12 13:50:21.812 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:22 compute-0 nova_compute[181978]: 2026-01-12 13:50:22.013 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:22 compute-0 nova_compute[181978]: 2026-01-12 13:50:22.250 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:22 compute-0 nova_compute[181978]: 2026-01-12 13:50:22.529 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:22 compute-0 nova_compute[181978]: 2026-01-12 13:50:22.602 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:23 compute-0 nova_compute[181978]: 2026-01-12 13:50:23.798 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:23 compute-0 nova_compute[181978]: 2026-01-12 13:50:23.813 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:23 compute-0 nova_compute[181978]: 2026-01-12 13:50:23.813 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:24 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:24.418 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:24 compute-0 nova_compute[181978]: 2026-01-12 13:50:24.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:24 compute-0 nova_compute[181978]: 2026-01-12 13:50:24.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:24 compute-0 nova_compute[181978]: 2026-01-12 13:50:24.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:50:26 compute-0 podman[213469]: 2026-01-12 13:50:26.553403727 +0000 UTC m=+0.042476124 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:50:27 compute-0 nova_compute[181978]: 2026-01-12 13:50:27.014 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:27 compute-0 nova_compute[181978]: 2026-01-12 13:50:27.251 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:31 compute-0 nova_compute[181978]: 2026-01-12 13:50:31.998 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225816.9966643, 6e6567cb-2108-4fae-9bf9-0626331454b8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:50:31 compute-0 nova_compute[181978]: 2026-01-12 13:50:31.998 181991 INFO nova.compute.manager [-] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] VM Stopped (Lifecycle Event)
Jan 12 13:50:32 compute-0 nova_compute[181978]: 2026-01-12 13:50:32.015 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:32 compute-0 nova_compute[181978]: 2026-01-12 13:50:32.018 181991 DEBUG nova.compute.manager [None req-d15d7a6f-5fb1-476a-9d8c-6dd93167b933 - - - - - -] [instance: 6e6567cb-2108-4fae-9bf9-0626331454b8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:32 compute-0 nova_compute[181978]: 2026-01-12 13:50:32.253 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.401 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.402 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.414 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.468 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.469 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.475 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.475 181991 INFO nova.compute.claims [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.588 181991 DEBUG nova.compute.provider_tree [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.602 181991 DEBUG nova.scheduler.client.report [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.618 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.618 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.654 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.654 181991 DEBUG nova.network.neutron [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.665 181991 INFO nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.674 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.726 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.729 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.729 181991 INFO nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Creating image(s)
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.730 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.730 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.732 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.743 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.799 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.800 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.800 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.809 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.860 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.861 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.888 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.890 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.890 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.940 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.941 181991 DEBUG nova.virt.disk.api [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.941 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.992 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.993 181991 DEBUG nova.virt.disk.api [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:50:34 compute-0 nova_compute[181978]: 2026-01-12 13:50:34.993 181991 DEBUG nova.objects.instance [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid 34997797-aac3-4d72-92c4-a51ee249bd90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:50:35 compute-0 nova_compute[181978]: 2026-01-12 13:50:35.008 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:50:35 compute-0 nova_compute[181978]: 2026-01-12 13:50:35.008 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Ensure instance console log exists: /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:50:35 compute-0 nova_compute[181978]: 2026-01-12 13:50:35.009 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:35 compute-0 nova_compute[181978]: 2026-01-12 13:50:35.009 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:35 compute-0 nova_compute[181978]: 2026-01-12 13:50:35.009 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:35 compute-0 nova_compute[181978]: 2026-01-12 13:50:35.318 181991 DEBUG nova.policy [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:50:36 compute-0 nova_compute[181978]: 2026-01-12 13:50:36.292 181991 DEBUG nova.network.neutron [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Successfully created port: 8a8799e1-01b3-47f7-85ad-0355618411ff _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.017 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.254 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.307 181991 DEBUG nova.network.neutron [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Successfully updated port: 8a8799e1-01b3-47f7-85ad-0355618411ff _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.321 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.321 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.321 181991 DEBUG nova.network.neutron [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.378 181991 DEBUG nova.compute.manager [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.378 181991 DEBUG nova.compute.manager [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing instance network info cache due to event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.378 181991 DEBUG oslo_concurrency.lockutils [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:50:37 compute-0 nova_compute[181978]: 2026-01-12 13:50:37.430 181991 DEBUG nova.network.neutron [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:50:37 compute-0 podman[213500]: 2026-01-12 13:50:37.551694555 +0000 UTC m=+0.042073327 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.317 181991 DEBUG nova.network.neutron [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updating instance_info_cache with network_info: [{"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.333 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.334 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Instance network_info: |[{"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.334 181991 DEBUG oslo_concurrency.lockutils [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.334 181991 DEBUG nova.network.neutron [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.336 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Start _get_guest_xml network_info=[{"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.352 181991 WARNING nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.356 181991 DEBUG nova.virt.libvirt.host [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.357 181991 DEBUG nova.virt.libvirt.host [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.359 181991 DEBUG nova.virt.libvirt.host [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.359 181991 DEBUG nova.virt.libvirt.host [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.360 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.360 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.360 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.360 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.360 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.360 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.361 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.361 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.361 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.361 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.361 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.361 181991 DEBUG nova.virt.hardware [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.364 181991 DEBUG nova.virt.libvirt.vif [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2101764621',display_name='tempest-TestNetworkBasicOps-server-2101764621',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2101764621',id=11,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTXY+5Js3/qGeOTbyU7Fp6Taau6MqKO6ctOdJh5uqBdeMuch+L0KSDhlREpRkZyft2pzNm0v0itN3ZI81fO8b75kt4yFKgyIMPxameJ8QaycUc+5JlXnAwQIngaPGZ2bQ==',key_name='tempest-TestNetworkBasicOps-23131883',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-ypsnzhb4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:50:34Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=34997797-aac3-4d72-92c4-a51ee249bd90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.364 181991 DEBUG nova.network.os_vif_util [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.365 181991 DEBUG nova.network.os_vif_util [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3f:05,bridge_name='br-int',has_traffic_filtering=True,id=8a8799e1-01b3-47f7-85ad-0355618411ff,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8799e1-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.366 181991 DEBUG nova.objects.instance [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid 34997797-aac3-4d72-92c4-a51ee249bd90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.373 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <uuid>34997797-aac3-4d72-92c4-a51ee249bd90</uuid>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <name>instance-0000000b</name>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-2101764621</nova:name>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:50:38</nova:creationTime>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         <nova:port uuid="8a8799e1-01b3-47f7-85ad-0355618411ff">
Jan 12 13:50:38 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <system>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <entry name="serial">34997797-aac3-4d72-92c4-a51ee249bd90</entry>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <entry name="uuid">34997797-aac3-4d72-92c4-a51ee249bd90</entry>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </system>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <os>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   </os>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <features>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   </features>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk.config"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:f7:3f:05"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <target dev="tap8a8799e1-01"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/console.log" append="off"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <video>
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </video>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:50:38 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:50:38 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:50:38 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:50:38 compute-0 nova_compute[181978]: </domain>
Jan 12 13:50:38 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.374 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Preparing to wait for external event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.375 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.375 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.375 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.376 181991 DEBUG nova.virt.libvirt.vif [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2101764621',display_name='tempest-TestNetworkBasicOps-server-2101764621',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2101764621',id=11,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTXY+5Js3/qGeOTbyU7Fp6Taau6MqKO6ctOdJh5uqBdeMuch+L0KSDhlREpRkZyft2pzNm0v0itN3ZI81fO8b75kt4yFKgyIMPxameJ8QaycUc+5JlXnAwQIngaPGZ2bQ==',key_name='tempest-TestNetworkBasicOps-23131883',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-ypsnzhb4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:50:34Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=34997797-aac3-4d72-92c4-a51ee249bd90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.376 181991 DEBUG nova.network.os_vif_util [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.376 181991 DEBUG nova.network.os_vif_util [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3f:05,bridge_name='br-int',has_traffic_filtering=True,id=8a8799e1-01b3-47f7-85ad-0355618411ff,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8799e1-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.377 181991 DEBUG os_vif [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3f:05,bridge_name='br-int',has_traffic_filtering=True,id=8a8799e1-01b3-47f7-85ad-0355618411ff,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8799e1-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.377 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.377 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.378 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.379 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.379 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a8799e1-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.380 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8a8799e1-01, col_values=(('external_ids', {'iface-id': '8a8799e1-01b3-47f7-85ad-0355618411ff', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:3f:05', 'vm-uuid': '34997797-aac3-4d72-92c4-a51ee249bd90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.381 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:38 compute-0 NetworkManager[55211]: <info>  [1768225838.3825] manager: (tap8a8799e1-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.384 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.385 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.386 181991 INFO os_vif [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:3f:05,bridge_name='br-int',has_traffic_filtering=True,id=8a8799e1-01b3-47f7-85ad-0355618411ff,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8799e1-01')
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.413 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.413 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.413 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:f7:3f:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.414 181991 INFO nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Using config drive
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.604 181991 INFO nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Creating config drive at /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk.config
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.608 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnvm0pa_f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.726 181991 DEBUG oslo_concurrency.processutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnvm0pa_f" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:38 compute-0 kernel: tap8a8799e1-01: entered promiscuous mode
Jan 12 13:50:38 compute-0 NetworkManager[55211]: <info>  [1768225838.7792] manager: (tap8a8799e1-01): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Jan 12 13:50:38 compute-0 ovn_controller[94974]: 2026-01-12T13:50:38Z|00148|binding|INFO|Claiming lport 8a8799e1-01b3-47f7-85ad-0355618411ff for this chassis.
Jan 12 13:50:38 compute-0 ovn_controller[94974]: 2026-01-12T13:50:38Z|00149|binding|INFO|8a8799e1-01b3-47f7-85ad-0355618411ff: Claiming fa:16:3e:f7:3f:05 10.100.0.4
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.781 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.797 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:3f:05 10.100.0.4'], port_security=['fa:16:3e:f7:3f:05 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dcf23cac-c128-431a-843f-97e2cc14c1fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a028035-7f0e-4424-9a98-a3f388afa919, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=8a8799e1-01b3-47f7-85ad-0355618411ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.798 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 8a8799e1-01b3-47f7-85ad-0355618411ff in datapath dcfaf59f-c145-4ceb-8579-9f58575d161f bound to our chassis
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.799 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcfaf59f-c145-4ceb-8579-9f58575d161f
Jan 12 13:50:38 compute-0 systemd-udevd[213541]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.808 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1218ec63-551b-444b-818d-b9434ab445e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.810 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdcfaf59f-c1 in ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.812 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdcfaf59f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.812 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[923a63ff-400f-4f88-a116-902a6c1ecae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.812 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[812a3477-f1d3-45fe-b9ea-bf149e4e56e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 systemd-machined[153581]: New machine qemu-11-instance-0000000b.
Jan 12 13:50:38 compute-0 NetworkManager[55211]: <info>  [1768225838.8234] device (tap8a8799e1-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:50:38 compute-0 NetworkManager[55211]: <info>  [1768225838.8241] device (tap8a8799e1-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.827 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[6e03c67f-02dc-4788-a3c2-a922dc061805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.843 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1a89a136-5d34-4ad1-98ce-5803d30b1b1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.844 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:38 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Jan 12 13:50:38 compute-0 ovn_controller[94974]: 2026-01-12T13:50:38Z|00150|binding|INFO|Setting lport 8a8799e1-01b3-47f7-85ad-0355618411ff ovn-installed in OVS
Jan 12 13:50:38 compute-0 ovn_controller[94974]: 2026-01-12T13:50:38Z|00151|binding|INFO|Setting lport 8a8799e1-01b3-47f7-85ad-0355618411ff up in Southbound
Jan 12 13:50:38 compute-0 nova_compute[181978]: 2026-01-12 13:50:38.851 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.881 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[4395199c-cfac-4412-bcda-0558d6f1cced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.894 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce9f989-48b3-4454-bff8-f71a722d5061]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 NetworkManager[55211]: <info>  [1768225838.8999] manager: (tapdcfaf59f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.939 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[0c72cc92-3862-4d6e-9bd1-1b988a682f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.946 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[e595e5ac-6a36-41eb-954f-99edab8f30e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 NetworkManager[55211]: <info>  [1768225838.9634] device (tapdcfaf59f-c0): carrier: link connected
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.967 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[14bdf8f1-f451-4cf7-8774-628b7888543a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.981 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3907fc0a-93fc-44d6-a8d6-f8cc8cb3cf54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcfaf59f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290499, 'reachable_time': 25042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213565, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:38 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:38.992 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[bd502f77-d263-4415-b4b3-fc53dc7470de]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feee:bc5e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 290499, 'tstamp': 290499}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213566, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.003 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff0a806-6811-4ddf-ae3c-87f839ac9e4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcfaf59f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290499, 'reachable_time': 25042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213567, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.022 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[d9502d20-7101-4e93-b4e0-dc46299bea4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.059 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7344f3d6-abff-4dbe-a098-7f46d4e9718b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.060 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcfaf59f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.060 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.061 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcfaf59f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:39 compute-0 kernel: tapdcfaf59f-c0: entered promiscuous mode
Jan 12 13:50:39 compute-0 NetworkManager[55211]: <info>  [1768225839.0630] manager: (tapdcfaf59f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.066 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.070 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcfaf59f-c0, col_values=(('external_ids', {'iface-id': '614d27be-df64-4b64-b0c6-45ad7c79ede1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:39 compute-0 ovn_controller[94974]: 2026-01-12T13:50:39Z|00152|binding|INFO|Releasing lport 614d27be-df64-4b64-b0c6-45ad7c79ede1 from this chassis (sb_readonly=0)
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.079 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcfaf59f-c145-4ceb-8579-9f58575d161f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcfaf59f-c145-4ceb-8579-9f58575d161f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.071 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.083 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[daf902f8-9ab6-4685-a4cc-37ba454d43e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.085 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-dcfaf59f-c145-4ceb-8579-9f58575d161f
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/dcfaf59f-c145-4ceb-8579-9f58575d161f.pid.haproxy
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID dcfaf59f-c145-4ceb-8579-9f58575d161f
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:50:39 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:39.087 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'env', 'PROCESS_TAG=haproxy-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dcfaf59f-c145-4ceb-8579-9f58575d161f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.283 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225839.2825863, 34997797-aac3-4d72-92c4-a51ee249bd90 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.284 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] VM Started (Lifecycle Event)
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.305 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.308 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225839.2828195, 34997797-aac3-4d72-92c4-a51ee249bd90 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.308 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] VM Paused (Lifecycle Event)
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.319 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.322 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.335 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:50:39 compute-0 podman[213603]: 2026-01-12 13:50:39.401638933 +0000 UTC m=+0.032011067 container create 02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:50:39 compute-0 systemd[1]: Started libpod-conmon-02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d.scope.
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.447 181991 DEBUG nova.compute.manager [req-7bb9343c-7e4f-4813-b2cd-6926196361a4 req-a694ecc0-f474-4937-9640-5320195cd2b5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.447 181991 DEBUG oslo_concurrency.lockutils [req-7bb9343c-7e4f-4813-b2cd-6926196361a4 req-a694ecc0-f474-4937-9640-5320195cd2b5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.448 181991 DEBUG oslo_concurrency.lockutils [req-7bb9343c-7e4f-4813-b2cd-6926196361a4 req-a694ecc0-f474-4937-9640-5320195cd2b5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.448 181991 DEBUG oslo_concurrency.lockutils [req-7bb9343c-7e4f-4813-b2cd-6926196361a4 req-a694ecc0-f474-4937-9640-5320195cd2b5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.448 181991 DEBUG nova.compute.manager [req-7bb9343c-7e4f-4813-b2cd-6926196361a4 req-a694ecc0-f474-4937-9640-5320195cd2b5 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Processing event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.449 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:50:39 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.451 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225839.4516623, 34997797-aac3-4d72-92c4-a51ee249bd90 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.452 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] VM Resumed (Lifecycle Event)
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.454 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:50:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b890c2b94c9c18807c4d6c4286160e1f219ed65f166d6803cdceeb894f787daf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.457 181991 INFO nova.virt.libvirt.driver [-] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Instance spawned successfully.
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.457 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:50:39 compute-0 podman[213603]: 2026-01-12 13:50:39.464343349 +0000 UTC m=+0.094715493 container init 02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:50:39 compute-0 podman[213603]: 2026-01-12 13:50:39.469627354 +0000 UTC m=+0.099999488 container start 02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 12 13:50:39 compute-0 podman[213603]: 2026-01-12 13:50:39.387234741 +0000 UTC m=+0.017606895 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.473 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.475 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.479 181991 DEBUG nova.network.neutron [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updated VIF entry in instance network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.479 181991 DEBUG nova.network.neutron [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updating instance_info_cache with network_info: [{"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.488 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.488 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.489 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.489 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.490 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.490 181991 DEBUG nova.virt.libvirt.driver [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:39 compute-0 neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f[213616]: [NOTICE]   (213620) : New worker (213622) forked
Jan 12 13:50:39 compute-0 neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f[213616]: [NOTICE]   (213620) : Loading success.
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.494 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.494 181991 DEBUG oslo_concurrency.lockutils [req-4c2e0380-db0d-4512-8104-7469e4787654 req-db3edeeb-7a63-43a7-a6df-3753778c3d7c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.534 181991 INFO nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Took 4.81 seconds to spawn the instance on the hypervisor.
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.534 181991 DEBUG nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.581 181991 INFO nova.compute.manager [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Took 5.14 seconds to build instance.
Jan 12 13:50:39 compute-0 nova_compute[181978]: 2026-01-12 13:50:39.592 181991 DEBUG oslo_concurrency.lockutils [None req-97409b3a-649e-44f8-9811-e96e490fc8cd d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:40.204 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:40.205 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:40.205 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:40 compute-0 podman[213627]: 2026-01-12 13:50:40.557525999 +0000 UTC m=+0.047817997 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 12 13:50:41 compute-0 nova_compute[181978]: 2026-01-12 13:50:41.517 181991 DEBUG nova.compute.manager [req-21d13b4e-78a9-4f1a-9b56-65e9d7131722 req-0f959ff2-fac6-4ba9-bfe3-ba9c605f069f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:41 compute-0 nova_compute[181978]: 2026-01-12 13:50:41.518 181991 DEBUG oslo_concurrency.lockutils [req-21d13b4e-78a9-4f1a-9b56-65e9d7131722 req-0f959ff2-fac6-4ba9-bfe3-ba9c605f069f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:41 compute-0 nova_compute[181978]: 2026-01-12 13:50:41.518 181991 DEBUG oslo_concurrency.lockutils [req-21d13b4e-78a9-4f1a-9b56-65e9d7131722 req-0f959ff2-fac6-4ba9-bfe3-ba9c605f069f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:41 compute-0 nova_compute[181978]: 2026-01-12 13:50:41.519 181991 DEBUG oslo_concurrency.lockutils [req-21d13b4e-78a9-4f1a-9b56-65e9d7131722 req-0f959ff2-fac6-4ba9-bfe3-ba9c605f069f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:41 compute-0 nova_compute[181978]: 2026-01-12 13:50:41.519 181991 DEBUG nova.compute.manager [req-21d13b4e-78a9-4f1a-9b56-65e9d7131722 req-0f959ff2-fac6-4ba9-bfe3-ba9c605f069f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] No waiting events found dispatching network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:50:41 compute-0 nova_compute[181978]: 2026-01-12 13:50:41.519 181991 WARNING nova.compute.manager [req-21d13b4e-78a9-4f1a-9b56-65e9d7131722 req-0f959ff2-fac6-4ba9-bfe3-ba9c605f069f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received unexpected event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff for instance with vm_state active and task_state None.
Jan 12 13:50:42 compute-0 nova_compute[181978]: 2026-01-12 13:50:42.257 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:42 compute-0 ovn_controller[94974]: 2026-01-12T13:50:42Z|00153|binding|INFO|Releasing lport 614d27be-df64-4b64-b0c6-45ad7c79ede1 from this chassis (sb_readonly=0)
Jan 12 13:50:42 compute-0 nova_compute[181978]: 2026-01-12 13:50:42.832 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:42 compute-0 NetworkManager[55211]: <info>  [1768225842.8325] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 12 13:50:42 compute-0 NetworkManager[55211]: <info>  [1768225842.8334] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 12 13:50:42 compute-0 ovn_controller[94974]: 2026-01-12T13:50:42Z|00154|binding|INFO|Releasing lport 614d27be-df64-4b64-b0c6-45ad7c79ede1 from this chassis (sb_readonly=0)
Jan 12 13:50:42 compute-0 nova_compute[181978]: 2026-01-12 13:50:42.864 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:42 compute-0 nova_compute[181978]: 2026-01-12 13:50:42.866 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.112 181991 DEBUG nova.compute.manager [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.113 181991 DEBUG nova.compute.manager [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing instance network info cache due to event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.114 181991 DEBUG oslo_concurrency.lockutils [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.114 181991 DEBUG oslo_concurrency.lockutils [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.115 181991 DEBUG nova.network.neutron [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.190 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.205 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Triggering sync for uuid 34997797-aac3-4d72-92c4-a51ee249bd90 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.206 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.206 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "34997797-aac3-4d72-92c4-a51ee249bd90" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.237 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "34997797-aac3-4d72-92c4-a51ee249bd90" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.382 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.840 181991 DEBUG nova.network.neutron [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updated VIF entry in instance network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.841 181991 DEBUG nova.network.neutron [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updating instance_info_cache with network_info: [{"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:43 compute-0 nova_compute[181978]: 2026-01-12 13:50:43.855 181991 DEBUG oslo_concurrency.lockutils [req-3d117271-806e-46e6-8780-27279840a4ae req-514a963b-608d-4a85-967c-a968c92b3f0b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:46 compute-0 nova_compute[181978]: 2026-01-12 13:50:46.818 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:46 compute-0 nova_compute[181978]: 2026-01-12 13:50:46.820 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:46 compute-0 nova_compute[181978]: 2026-01-12 13:50:46.832 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:50:46 compute-0 nova_compute[181978]: 2026-01-12 13:50:46.900 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:46 compute-0 nova_compute[181978]: 2026-01-12 13:50:46.901 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:46 compute-0 nova_compute[181978]: 2026-01-12 13:50:46.908 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:50:46 compute-0 nova_compute[181978]: 2026-01-12 13:50:46.908 181991 INFO nova.compute.claims [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.005 181991 DEBUG nova.compute.provider_tree [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.015 181991 DEBUG nova.scheduler.client.report [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.030 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.030 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.062 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.062 181991 DEBUG nova.network.neutron [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.072 181991 INFO nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.084 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.150 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.151 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.151 181991 INFO nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Creating image(s)
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.152 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.152 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.153 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.163 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.229 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.232 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.232 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.242 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.260 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.304 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.305 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.321 181991 DEBUG nova.policy [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.328 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.331 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.331 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.388 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.389 181991 DEBUG nova.virt.disk.api [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.390 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.449 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.450 181991 DEBUG nova.virt.disk.api [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.450 181991 DEBUG nova.objects.instance [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.464 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.464 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Ensure instance console log exists: /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.464 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.465 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:47 compute-0 nova_compute[181978]: 2026-01-12 13:50:47.465 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.318 181991 DEBUG nova.network.neutron [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Successfully created port: 028fd111-a615-4adf-a755-cf7fe0f5b0d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.385 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.780 181991 DEBUG nova.network.neutron [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Successfully updated port: 028fd111-a615-4adf-a755-cf7fe0f5b0d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.794 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.794 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.795 181991 DEBUG nova.network.neutron [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.850 181991 DEBUG nova.compute.manager [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-changed-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.851 181991 DEBUG nova.compute.manager [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Refreshing instance network info cache due to event network-changed-028fd111-a615-4adf-a755-cf7fe0f5b0d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.851 181991 DEBUG oslo_concurrency.lockutils [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:50:48 compute-0 nova_compute[181978]: 2026-01-12 13:50:48.894 181991 DEBUG nova.network.neutron [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.443 181991 DEBUG nova.network.neutron [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updating instance_info_cache with network_info: [{"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.461 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.461 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Instance network_info: |[{"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.462 181991 DEBUG oslo_concurrency.lockutils [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.462 181991 DEBUG nova.network.neutron [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Refreshing network info cache for port 028fd111-a615-4adf-a755-cf7fe0f5b0d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.465 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Start _get_guest_xml network_info=[{"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.469 181991 WARNING nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.476 181991 DEBUG nova.virt.libvirt.host [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.477 181991 DEBUG nova.virt.libvirt.host [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.480 181991 DEBUG nova.virt.libvirt.host [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.480 181991 DEBUG nova.virt.libvirt.host [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.481 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.481 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.481 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.481 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.482 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.482 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.482 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.482 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.482 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.483 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.483 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.483 181991 DEBUG nova.virt.hardware [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.486 181991 DEBUG nova.virt.libvirt.vif [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:50:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1065387734',display_name='tempest-TestNetworkBasicOps-server-1065387734',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1065387734',id=12,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBObNdm8G3Bb3woErIsQi8zn1PlnkwijV+u0/zGRytOK7GZmUmhXDiKgHezfs87bk9oMJ+2hnBkZK0mYYN+55I9pE+wWpOHP6C/l1lwen1+dU8yrkChT77LnAkMSueXD8Ew==',key_name='tempest-TestNetworkBasicOps-1687337889',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-xhwravzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:50:47Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.486 181991 DEBUG nova.network.os_vif_util [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.487 181991 DEBUG nova.network.os_vif_util [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:f1:d5,bridge_name='br-int',has_traffic_filtering=True,id=028fd111-a615-4adf-a755-cf7fe0f5b0d6,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap028fd111-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.488 181991 DEBUG nova.objects.instance [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.498 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <uuid>b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5</uuid>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <name>instance-0000000c</name>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-1065387734</nova:name>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:50:49</nova:creationTime>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         <nova:port uuid="028fd111-a615-4adf-a755-cf7fe0f5b0d6">
Jan 12 13:50:49 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <system>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <entry name="serial">b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5</entry>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <entry name="uuid">b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5</entry>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </system>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <os>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   </os>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <features>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   </features>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.config"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:db:f1:d5"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <target dev="tap028fd111-a6"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/console.log" append="off"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <video>
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </video>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:50:49 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:50:49 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:50:49 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:50:49 compute-0 nova_compute[181978]: </domain>
Jan 12 13:50:49 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.499 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Preparing to wait for external event network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.499 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.500 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.500 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.500 181991 DEBUG nova.virt.libvirt.vif [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:50:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1065387734',display_name='tempest-TestNetworkBasicOps-server-1065387734',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1065387734',id=12,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBObNdm8G3Bb3woErIsQi8zn1PlnkwijV+u0/zGRytOK7GZmUmhXDiKgHezfs87bk9oMJ+2hnBkZK0mYYN+55I9pE+wWpOHP6C/l1lwen1+dU8yrkChT77LnAkMSueXD8Ew==',key_name='tempest-TestNetworkBasicOps-1687337889',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-xhwravzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:50:47Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.501 181991 DEBUG nova.network.os_vif_util [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.501 181991 DEBUG nova.network.os_vif_util [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:f1:d5,bridge_name='br-int',has_traffic_filtering=True,id=028fd111-a615-4adf-a755-cf7fe0f5b0d6,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap028fd111-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.501 181991 DEBUG os_vif [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:f1:d5,bridge_name='br-int',has_traffic_filtering=True,id=028fd111-a615-4adf-a755-cf7fe0f5b0d6,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap028fd111-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.502 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.502 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.503 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.505 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.505 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap028fd111-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.506 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap028fd111-a6, col_values=(('external_ids', {'iface-id': '028fd111-a615-4adf-a755-cf7fe0f5b0d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:f1:d5', 'vm-uuid': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.507 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:49 compute-0 NetworkManager[55211]: <info>  [1768225849.5079] manager: (tap028fd111-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.509 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.515 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.516 181991 INFO os_vif [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:f1:d5,bridge_name='br-int',has_traffic_filtering=True,id=028fd111-a615-4adf-a755-cf7fe0f5b0d6,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap028fd111-a6')
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.551 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.551 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.552 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:db:f1:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.552 181991 INFO nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Using config drive
Jan 12 13:50:49 compute-0 ovn_controller[94974]: 2026-01-12T13:50:49Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:3f:05 10.100.0.4
Jan 12 13:50:49 compute-0 ovn_controller[94974]: 2026-01-12T13:50:49Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:3f:05 10.100.0.4
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.761 181991 INFO nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Creating config drive at /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.config
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.769 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf64yl2yk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.895 181991 DEBUG oslo_concurrency.processutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf64yl2yk" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:50:49 compute-0 kernel: tap028fd111-a6: entered promiscuous mode
Jan 12 13:50:49 compute-0 NetworkManager[55211]: <info>  [1768225849.9553] manager: (tap028fd111-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.958 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:49 compute-0 ovn_controller[94974]: 2026-01-12T13:50:49Z|00155|binding|INFO|Claiming lport 028fd111-a615-4adf-a755-cf7fe0f5b0d6 for this chassis.
Jan 12 13:50:49 compute-0 ovn_controller[94974]: 2026-01-12T13:50:49Z|00156|binding|INFO|028fd111-a615-4adf-a755-cf7fe0f5b0d6: Claiming fa:16:3e:db:f1:d5 10.100.0.9
Jan 12 13:50:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:49.964 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:f1:d5 10.100.0.9'], port_security=['fa:16:3e:db:f1:d5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '13b83140-3702-4853-ba8b-cdde31476ed3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a028035-7f0e-4424-9a98-a3f388afa919, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=028fd111-a615-4adf-a755-cf7fe0f5b0d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:50:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:49.965 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 028fd111-a615-4adf-a755-cf7fe0f5b0d6 in datapath dcfaf59f-c145-4ceb-8579-9f58575d161f bound to our chassis
Jan 12 13:50:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:49.965 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcfaf59f-c145-4ceb-8579-9f58575d161f
Jan 12 13:50:49 compute-0 ovn_controller[94974]: 2026-01-12T13:50:49Z|00157|binding|INFO|Setting lport 028fd111-a615-4adf-a755-cf7fe0f5b0d6 ovn-installed in OVS
Jan 12 13:50:49 compute-0 ovn_controller[94974]: 2026-01-12T13:50:49Z|00158|binding|INFO|Setting lport 028fd111-a615-4adf-a755-cf7fe0f5b0d6 up in Southbound
Jan 12 13:50:49 compute-0 nova_compute[181978]: 2026-01-12 13:50:49.989 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:49 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:49.996 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb07093-c080-4916-bcb3-ad6333671c3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:50 compute-0 systemd-machined[153581]: New machine qemu-12-instance-0000000c.
Jan 12 13:50:50 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Jan 12 13:50:50 compute-0 systemd-udevd[213716]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.041 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[45e7733d-2586-4e5a-ac17-075c0836de76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.044 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[8d042c41-45ee-4517-a28c-f7bfe21c29a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:50 compute-0 NetworkManager[55211]: <info>  [1768225850.0547] device (tap028fd111-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:50:50 compute-0 NetworkManager[55211]: <info>  [1768225850.0554] device (tap028fd111-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.074 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[334b8428-6f86-47ea-b33f-ec39f1b2427e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:50 compute-0 podman[213680]: 2026-01-12 13:50:50.099752445 +0000 UTC m=+0.146515187 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.108 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[72f361d4-905d-4a29-972d-60f6620dc0f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcfaf59f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290499, 'reachable_time': 25042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213755, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:50 compute-0 podman[213679]: 2026-01-12 13:50:50.117727751 +0000 UTC m=+0.150524825 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:50:50 compute-0 podman[213681]: 2026-01-12 13:50:50.117637281 +0000 UTC m=+0.153059411 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=)
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.127 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae5e166-7ba8-43be-9c2e-038d44a8a048]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdcfaf59f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 290506, 'tstamp': 290506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213763, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdcfaf59f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 290508, 'tstamp': 290508}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213763, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.129 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcfaf59f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.131 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.132 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcfaf59f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.132 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.133 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcfaf59f-c0, col_values=(('external_ids', {'iface-id': '614d27be-df64-4b64-b0c6-45ad7c79ede1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:50:50 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:50:50.133 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.328 181991 DEBUG nova.network.neutron [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updated VIF entry in instance network info cache for port 028fd111-a615-4adf-a755-cf7fe0f5b0d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.332 181991 DEBUG nova.network.neutron [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updating instance_info_cache with network_info: [{"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.347 181991 DEBUG oslo_concurrency.lockutils [req-96c7527f-b4c1-48d9-819b-a8be88ca8d55 req-0e44a84a-4c2c-47f3-b0ef-b346f6e28a40 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.599 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225850.599273, b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.600 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] VM Started (Lifecycle Event)
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.618 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.620 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225850.5994608, b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.620 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] VM Paused (Lifecycle Event)
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.633 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.635 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.647 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.914 181991 DEBUG nova.compute.manager [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.914 181991 DEBUG oslo_concurrency.lockutils [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.914 181991 DEBUG oslo_concurrency.lockutils [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.914 181991 DEBUG oslo_concurrency.lockutils [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.915 181991 DEBUG nova.compute.manager [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Processing event network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.915 181991 DEBUG nova.compute.manager [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.915 181991 DEBUG oslo_concurrency.lockutils [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.915 181991 DEBUG oslo_concurrency.lockutils [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.915 181991 DEBUG oslo_concurrency.lockutils [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.915 181991 DEBUG nova.compute.manager [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] No waiting events found dispatching network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.916 181991 WARNING nova.compute.manager [req-53e4ee2f-43eb-4645-a960-c763df330942 req-5fc973e0-2771-438b-8351-16a0c645e352 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received unexpected event network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 for instance with vm_state building and task_state spawning.
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.916 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.919 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225850.9184165, b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.919 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] VM Resumed (Lifecycle Event)
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.922 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.924 181991 INFO nova.virt.libvirt.driver [-] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Instance spawned successfully.
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.924 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.936 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.941 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.944 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.944 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.945 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.945 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.945 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.946 181991 DEBUG nova.virt.libvirt.driver [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.961 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.989 181991 INFO nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Took 3.84 seconds to spawn the instance on the hypervisor.
Jan 12 13:50:50 compute-0 nova_compute[181978]: 2026-01-12 13:50:50.989 181991 DEBUG nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:50:51 compute-0 nova_compute[181978]: 2026-01-12 13:50:51.030 181991 INFO nova.compute.manager [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Took 4.16 seconds to build instance.
Jan 12 13:50:51 compute-0 nova_compute[181978]: 2026-01-12 13:50:51.040 181991 DEBUG oslo_concurrency.lockutils [None req-9bf46cc5-592a-4e7b-82db-144481788195 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:50:52 compute-0 nova_compute[181978]: 2026-01-12 13:50:52.261 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:53 compute-0 nova_compute[181978]: 2026-01-12 13:50:53.596 181991 DEBUG nova.compute.manager [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-changed-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:50:53 compute-0 nova_compute[181978]: 2026-01-12 13:50:53.596 181991 DEBUG nova.compute.manager [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Refreshing instance network info cache due to event network-changed-028fd111-a615-4adf-a755-cf7fe0f5b0d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:50:53 compute-0 nova_compute[181978]: 2026-01-12 13:50:53.597 181991 DEBUG oslo_concurrency.lockutils [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:50:53 compute-0 nova_compute[181978]: 2026-01-12 13:50:53.597 181991 DEBUG oslo_concurrency.lockutils [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:50:53 compute-0 nova_compute[181978]: 2026-01-12 13:50:53.597 181991 DEBUG nova.network.neutron [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Refreshing network info cache for port 028fd111-a615-4adf-a755-cf7fe0f5b0d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:50:54 compute-0 nova_compute[181978]: 2026-01-12 13:50:54.491 181991 DEBUG nova.network.neutron [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updated VIF entry in instance network info cache for port 028fd111-a615-4adf-a755-cf7fe0f5b0d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:50:54 compute-0 nova_compute[181978]: 2026-01-12 13:50:54.492 181991 DEBUG nova.network.neutron [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updating instance_info_cache with network_info: [{"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:50:54 compute-0 nova_compute[181978]: 2026-01-12 13:50:54.507 181991 DEBUG oslo_concurrency.lockutils [req-b01bfc04-d336-44e5-abe5-f5894feb1780 req-04f59222-35a7-42c5-b551-d34aa61f3a75 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:50:54 compute-0 nova_compute[181978]: 2026-01-12 13:50:54.508 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.124 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'name': 'tempest-TestNetworkBasicOps-server-2101764621', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c978298f864c4039b47e09202eaf780c', 'user_id': 'd4158a3958504a578730a6b3561138ce', 'hostId': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.127 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'name': 'tempest-TestNetworkBasicOps-server-1065387734', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c978298f864c4039b47e09202eaf780c', 'user_id': 'd4158a3958504a578730a6b3561138ce', 'hostId': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.129 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 34997797-aac3-4d72-92c4-a51ee249bd90 / tap8a8799e1-01 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.130 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.132 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 / tap028fd111-a6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.132 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6f37141-9ad8-4618-a6c1-07834b52ace1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.127822', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b7482e78-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': '8a06c74ef2d33b2716e681a159f6d4bb46560a2d58c4036f627b4d2362feba39'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.127822', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b7488f4e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': 'eabe316cc47e0e1287a907a1d890c6278c4c3a4f398c963da99b1162ef75f581'}]}, 'timestamp': '2026-01-12 13:50:55.132996', '_unique_id': 'a5ea03b1c507411f858084c7d93cded6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.133 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.134 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.outgoing.bytes volume: 1438 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.134 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f05a273b-5cd8-4eb3-b395-67785b662fcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1438, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.134732', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b748de36-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': '13413b521438f8bbe7aac6518777eb4303f9009cd70564fca424eed369b611c5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.134732', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b748e6b0-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': '43a67818ede153156303d51e8792f05955d4661b54e54775f6e1a6649c76b718'}]}, 'timestamp': '2026-01-12 13:50:55.135193', '_unique_id': 'df66558521954a54b832ad2ac1f0ca98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.135 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.136 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.136 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>]
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.154 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.write.requests volume: 311 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.154 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.173 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a9789ea-ba74-4237-ae08-43bb7fa45b39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 311, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.136565', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b74be568-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': 'beb16bbedb8387c169484fb71d770d26ca835f8a909da76e7b173cab5806eb60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': 
None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.136565', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b74bee6e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': '3604af80d17f7a3c6c444462f5f83fccaa2aa968493d768464aad8e9e601ee02'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.136565', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b74ed2e6-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': '39ee29bf7deeaae9cc37886e877ded29eba796b0270e39d749390585f68f85e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.136565', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b74edcdc-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': '2e5c1f760b7a83036b22e152c434e28529e06265d061e32e9b5f24a21681389c'}]}, 'timestamp': '2026-01-12 13:50:55.174261', '_unique_id': 'c12120bdd13c44ffabe214ab26ba7a3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.174 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.175 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8168b5dc-0399-470f-89fb-87fd2ae02be2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.175760', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b74f2214-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': '2aa3397927e39fd2c10a342cb42cea7e05b95d4de1078da516d6f0718b41d25a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.175760', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b74f2bd8-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': 'afc4cbb7b44e1f2a630f56e9b1ea6fa71a44aee93f254b45a4a9376d14a2588b'}]}, 'timestamp': '2026-01-12 13:50:55.176286', '_unique_id': '8d9d0afcd096473e808f8b6f5a6cd7e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.176 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.178 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.178 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f87be989-7708-4942-8828-7ef908b43fe7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.178650', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b74f9244-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': 'ba7976cdb219f5aca075de667fc928c7dae82eecdc01bfcf3b9231d0890686be'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.178650', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b74f9ca8-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': '2e55ed71dc590d6bc7eda8d622a7ae8f0fa9438df369f67cd86ac0f04c297bc1'}]}, 'timestamp': '2026-01-12 13:50:55.179174', '_unique_id': '0fc6e7e30a8145a4898803a2ca5ececf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.179 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.180 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.read.requests volume: 1088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.180 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.181 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.181 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e538451-95bc-40bd-a635-fb13a8f1904a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1088, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.180566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b74fdd76-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': '306fbcfc624340b5fc920e0693026aa647567c578505315d8b954f624123e5c2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': 
None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.180566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b74fe76c-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': 'c7df20e61134b0c402c22d3100e0f5c4feaecf5b77271632d1f354d6cc87cb24'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.180566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b74ff036-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': '52c27236db4aa0152977431d4b5bb8dd710d3b148e22aadad4c44484907d5528'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.180566', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b74ff8b0-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': 'd331ee4d716f552b14356eaa03b6ee1709f5dbfc5a6f33392e1fbcccdb8d311b'}]}, 'timestamp': '2026-01-12 13:50:55.181522', '_unique_id': '06c108b63e9a4a80b705d33785033b90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.183 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.183 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>]
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.183 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.183 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.183 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b8cb25a-dcd3-431b-b7fc-fa042cb43c10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.183364', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b75049e6-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': 'bc83bf5dac51046c8259eec705b32e892d8465f954be221e85518e5ac54dd7ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.183364', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b7505364-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': 'd20b83cce9d91a6a9473607de0d44ca5b093c60f93e20844ef4a2e10b1807402'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.183364', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b7505cf6-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': 'b98065f6277d370ebd119192d716b06292c385813ef293955dcebc96d35bf420'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.183364', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b7506598-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': '3e6307873a20b599fa49fb3a3ba42974aa63a321b4d663de88b0fc2d46f9a101'}]}, 'timestamp': '2026-01-12 13:50:55.184310', '_unique_id': 'cd32dfbf4f3243adb51cac63f9acbbf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.184 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.185 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.196 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/cpu volume: 9730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.207 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/cpu volume: 4070000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de415768-5777-478a-986b-17c0e543935a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9730000000, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'timestamp': '2026-01-12T13:50:55.185769', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b75251d2-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.263765185, 'message_signature': '884213a09ff4bc936c9c3b9df93c0d384852c15648f4dc2e7dc320a1a0123a66'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4070000000, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'timestamp': '2026-01-12T13:50:55.185769', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b753eee8-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.274338184, 'message_signature': 'b38debe12e0e215d5b60af8b7f5350b7423b9d96ef936e4b14ca00a42e7d6444'}]}, 'timestamp': '2026-01-12 13:50:55.207499', '_unique_id': 'b65206031d56437697f3e3679637b1b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.208 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd771d58-6d06-4157-b5df-ae26a847092a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.208961', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b754325e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': '1e4fa4b74a6a07941eeedc875e9bf78f0d768dff11b70fa9ce4e46c537bc4a51'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.208961', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b7543c0e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': 'e40fdf6829f1861e88330f53c18647f35f68bea09c4bd3adbbc61e6918d37605'}]}, 'timestamp': '2026-01-12 13:50:55.209471', '_unique_id': '788a35c2195e48cc9ebd03fb56e30645'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.209 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.210 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.211 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>]
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.211 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.211 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3b0ab84-7330-4960-add8-6aa4240c6eee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.211295', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b7548d1c-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': '619b50a8d729654ac14b66d50cb407452de058eec3dfe2ceb9f87d78f4c7d3b3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.211295', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b7549712-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': 'e7c566a8d2412ff6737dd82a60f2e9afd5d68f88644daa13dea55ed3afecb20f'}]}, 'timestamp': '2026-01-12 13:50:55.211800', '_unique_id': 'b47fb977c809426cb6a07f2e99029a40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.212 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.219 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.219 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.228 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.228 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '962891a4-30a4-4aef-9cb8-c08335a62b97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.213261', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b755d348-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.280432569, 'message_signature': '8c09761cd13104ce42a022eb36c9e52d451a0b345bf012afca2c45cb5291e984'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.213261', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b755de56-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.280432569, 'message_signature': 'f932f858b5c8c49a883ec18acf7c740e7fce6606b6c53949fa5dbb850e3c2421'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.213261', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b757252c-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.287323, 'message_signature': '98130ebb1779fe5876c3de14178bca3ea572ed7b097fb6ccaac74825e05582f0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.213261', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b7572f54-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.287323, 'message_signature': 'a879a1090a5a103f1b004b9f6d77d7205f0442ac2bb5bb683cdb92593214b574'}]}, 'timestamp': '2026-01-12 13:50:55.228800', '_unique_id': '1352d153867349fca6f3d4edc68c3aeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.229 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.230 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.read.latency volume: 187272470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.230 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.read.latency volume: 99336951 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.230 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.read.latency volume: 141862409 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.read.latency volume: 5672566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c2da417-a29b-48f9-8e98-98b82bd853f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187272470, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.230288', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b7577342-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': '02cf5427f48b06d09abfeae5f3839bc17530a4821e31fdba76121a9bb4fdc5bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 99336951, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.230288', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b7577cf2-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': 'd629e9234b0e8d0a1b6b44d3c1c2045d8795f8a991be3a131777ecf84486e2fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 141862409, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.230288', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b75786ac-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': 'e7d339448593767f95898200cac029bbdf3259876090580edd7cf0d2ec7ae836'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5672566, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.230288', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b7578f9e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': 'bb940c87e15639c377ffda64359e7f3217c0efed9edbf0331b12e7da8304f1d6'}]}, 'timestamp': '2026-01-12 13:50:55.231261', '_unique_id': 'e148e1f0538342c889711d724279441a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.231 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.232 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.233 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.233 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.233 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '151a6c7c-2766-44e7-a135-61101b3253bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.232768', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b757d4f4-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.280432569, 'message_signature': '3c563e64c1cf3c08378ab307eadc6f30011201e10c6daaec7b17434a70e3daf1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.232768', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b757de4a-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.280432569, 'message_signature': '413eb997be3b2c23e12930a20df773e1d8f61fb0d067f2e035e0a3a427c13828'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.232768', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 
'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b757e6f6-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.287323, 'message_signature': '77a6ae03728bced7ad89df947065a03e39b2d3e2992f65b28f1fc3c13192c9e3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.232768', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b757efc0-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.287323, 'message_signature': 'd5781e106fc589a37f04608d07f605757c8a293fe47b15ceadc9d17c4b0f14a8'}]}, 'timestamp': '2026-01-12 13:50:55.233721', '_unique_id': 'afc5cb0d0f5444ef85cdac2770656095'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 rsyslogd[962]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.234 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.235 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.235 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf44e365-404d-4a08-95ed-c8d14c7e1ca9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.235203', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b7583336-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': 'a111ac623a3fab7a8355f60f1960309232e12cabdc8d990fd9ccfe3605c3e655'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.235203', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b7583c96-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': 'e277ad20e4537b777b117626ba0e9d491130a3f3652917dcb70ffd98c0c9d0dd'}]}, 'timestamp': '2026-01-12 13:50:55.235707', '_unique_id': 'f84f133639044356af3a1143b80e1470'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.236 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.237 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.237 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b16f4bc6-1562-4c33-a152-91b36f1d159a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.237115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b7587df0-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': 'bf0fec23d062f825d4f29438778ea6e55c0164d2bca4d8ca90a102fdab1e455b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.237115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b758876e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': '3666aaf688a07de836a910e0e577c76b9d1d4e7344f8fa2ee0efdd5094cde872'}]}, 'timestamp': '2026-01-12 13:50:55.237623', '_unique_id': 'd01986b1e0de464eb31eee29f506a56f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e07437b1-fe47-4a5d-abfc-bac1d4643c5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.239020', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b758c846-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': 'e4883ea7bcd9b152316a2343c048587335b72fc60e931e5264fb5482c21451ad'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.239020', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b758d192-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': '52f0882a4493db3320850448b88f077986e809000f4150c8e86ce0cde23dfe3e'}]}, 'timestamp': '2026-01-12 13:50:55.239510', '_unique_id': 'ddca94a5801c48509163da46dd30f8a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.239 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.240 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.write.latency volume: 490213540 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.241 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.241 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.241 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc62dd24-5e1f-4bee-870c-ca7df2bc9341', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 490213540, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.240934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b759130a-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': '2f6b97f0a3c9858ab26507cfea6121760f14c789d65b6457e4f62f38cc743790'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.240934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b7591c06-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': '0512a0cc0c6ad5a0dda56eac451b28ff67aeeb2540deff93b409d3f3f1a3d8b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.240934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b759249e-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': 'fabcc0be0754503099715e74d7d1f95c07eb98953c89ea047bc7d4fe5ef20307'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.240934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b7592d86-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': '8463fce5302d5c8b03dca8e84a2cb35bfc55a0cb9e98f7a1862d35e169033bbb'}]}, 'timestamp': '2026-01-12 13:50:55.241856', '_unique_id': '72ff4305caa841369003bdc576c396cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.242 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.243 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.243 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2101764621>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1065387734>]
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.243 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.read.bytes volume: 30341632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.243 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.244 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.244 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd02fd516-c872-441e-ab7e-15f3ada1899a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30341632, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.243708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b7597f16-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': 'e04ce02f67e24974e9cb8b08c305517ee335d562c2ed5ffa3a436d59af216de6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 
'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.243708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b75988da-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.203735771, 'message_signature': '5ada888806e195821cea29462cc5c950b98962222e1a29d9f7741b1f6c457f2a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.243708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b7599168-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': '3ab890821a9d32f006f5e7d781328ec36ad14bb83314b5d061eb9625ab1b4635'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.243708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b75999b0-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.222196328, 'message_signature': '4903ef0643a2a66ebc897268eb344d2136fe2273a1d7e2f459071115d20c35c0'}]}, 'timestamp': '2026-01-12 13:50:55.244631', '_unique_id': '4eae4192c84e44c3b94f2425f54c7d8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.246 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.246 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ce0b076-9fc2-46dc-bc85-1d649a5c6169', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000b-34997797-aac3-4d72-92c4-a51ee249bd90-tap8a8799e1-01', 'timestamp': '2026-01-12T13:50:55.246079', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'tap8a8799e1-01', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:3f:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a8799e1-01'}, 'message_id': 'b759dc18-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.195020497, 'message_signature': 'bbd1980f9c0eb9f777b88957c6c53bc03e6ac3deab522261edaada918095f3d4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'instance-0000000c-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-tap028fd111-a6', 'timestamp': '2026-01-12T13:50:55.246079', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'tap028fd111-a6', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:db:f1:d5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap028fd111-a6'}, 'message_id': 'b759e582-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.197683877, 'message_signature': '6e943434cdae10dff84a20efaad327f70a2c35c8a898a403d162517abc1b120d'}]}, 'timestamp': '2026-01-12 13:50:55.246577', '_unique_id': '4244d50fb7b84b329ec1f37337f3745a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.248 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.248 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.248 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.248 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c2a53cf-5151-4541-b9cb-e97e7fcd9d93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90-vda', 'timestamp': '2026-01-12T13:50:55.247989', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b75a2682-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.280432569, 'message_signature': '4dfcf62e0e1176e9260ea7b9954acc8d6e5455bc405339e218c9c15e0c113273'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 
'34997797-aac3-4d72-92c4-a51ee249bd90-sda', 'timestamp': '2026-01-12T13:50:55.247989', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b75a2f88-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.280432569, 'message_signature': '4296dfae8e9eac833aaed8ccdbfa950a916a8efafc5f7c7375f43fbf711e4fa5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-vda', 'timestamp': '2026-01-12T13:50:55.247989', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b75a3834-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.287323, 'message_signature': '37a3d8e437e811d1f41eb400df901c0c82720774d7905f7a67069b453ee7cd3a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-sda', 'timestamp': '2026-01-12T13:50:55.247989', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1065387734', 'name': 'instance-0000000c', 'instance_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b75a4108-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.287323, 'message_signature': '2865e91f1a996b0ef977f325aff93a1ee1078a9c93be78fb95eb3c9cabc33c6d'}]}, 'timestamp': '2026-01-12 13:50:55.248935', '_unique_id': 'de13efa087e6407a98c3fe5688e5d147'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.249 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.250 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.250 12 DEBUG ceilometer.compute.pollsters [-] 34997797-aac3-4d72-92c4-a51ee249bd90/memory.usage volume: 40.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.250 12 DEBUG ceilometer.compute.pollsters [-] b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.250 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5: ceilometer.compute.pollsters.NoVolumeException
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e214ec2-6b6e-41e1-a88a-f0bb84f97085', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.43359375, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_name': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_name': None, 'resource_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'timestamp': '2026-01-12T13:50:55.250349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2101764621', 'name': 'instance-0000000b', 'instance_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'instance_type': 'm1.nano', 'host': '2267bcbcb58f2fc606707b520302cecd7fbcfe49aa3ec80d707f50e8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '0bbd7717-2f21-486b-811b-14d24452f9a6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}, 'image_ref': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b75a82bc-efbd-11f0-9e2d-fa163ee03944', 'monotonic_time': 2921.263765185, 'message_signature': 'b1424f8b0d3a6516b5bd8c2710f5ee60b38c073631b1c1007ef3f567cbc063e6'}]}, 'timestamp': '2026-01-12 13:50:55.250843', '_unique_id': '0a8e1f936ac345cea9d22f5a0e53368b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     yield
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 12 13:50:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:50:55.251 12 ERROR oslo_messaging.notify.messaging 
Jan 12 13:50:57 compute-0 nova_compute[181978]: 2026-01-12 13:50:57.263 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:50:57 compute-0 podman[213772]: 2026-01-12 13:50:57.556408289 +0000 UTC m=+0.046624698 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 12 13:50:59 compute-0 nova_compute[181978]: 2026-01-12 13:50:59.510 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:02 compute-0 nova_compute[181978]: 2026-01-12 13:51:02.265 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:02 compute-0 ovn_controller[94974]: 2026-01-12T13:51:02Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:f1:d5 10.100.0.9
Jan 12 13:51:02 compute-0 ovn_controller[94974]: 2026-01-12T13:51:02Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:f1:d5 10.100.0.9
Jan 12 13:51:04 compute-0 nova_compute[181978]: 2026-01-12 13:51:04.513 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:07 compute-0 nova_compute[181978]: 2026-01-12 13:51:07.266 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:08 compute-0 podman[213796]: 2026-01-12 13:51:08.54244591 +0000 UTC m=+0.037481288 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.408 181991 INFO nova.compute.manager [None req-2e7920a4-f4e5-4387-9b25-1ee9e232670e d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Get console output
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.411 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.515 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.972 181991 DEBUG nova.compute.manager [req-282a75bc-e700-416e-9160-ffce25798363 req-6850a2d6-f17e-417e-9674-1a99390de36f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-unplugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.972 181991 DEBUG oslo_concurrency.lockutils [req-282a75bc-e700-416e-9160-ffce25798363 req-6850a2d6-f17e-417e-9674-1a99390de36f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.973 181991 DEBUG oslo_concurrency.lockutils [req-282a75bc-e700-416e-9160-ffce25798363 req-6850a2d6-f17e-417e-9674-1a99390de36f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.973 181991 DEBUG oslo_concurrency.lockutils [req-282a75bc-e700-416e-9160-ffce25798363 req-6850a2d6-f17e-417e-9674-1a99390de36f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.973 181991 DEBUG nova.compute.manager [req-282a75bc-e700-416e-9160-ffce25798363 req-6850a2d6-f17e-417e-9674-1a99390de36f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] No waiting events found dispatching network-vif-unplugged-8a8799e1-01b3-47f7-85ad-0355618411ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.973 181991 WARNING nova.compute.manager [req-282a75bc-e700-416e-9160-ffce25798363 req-6850a2d6-f17e-417e-9674-1a99390de36f 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received unexpected event network-vif-unplugged-8a8799e1-01b3-47f7-85ad-0355618411ff for instance with vm_state active and task_state None.
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.976 181991 DEBUG nova.compute.manager [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.977 181991 DEBUG nova.compute.manager [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing instance network info cache due to event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.977 181991 DEBUG oslo_concurrency.lockutils [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.977 181991 DEBUG oslo_concurrency.lockutils [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:51:09 compute-0 nova_compute[181978]: 2026-01-12 13:51:09.977 181991 DEBUG nova.network.neutron [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:51:10 compute-0 nova_compute[181978]: 2026-01-12 13:51:10.738 181991 DEBUG nova.network.neutron [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updated VIF entry in instance network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:51:10 compute-0 nova_compute[181978]: 2026-01-12 13:51:10.739 181991 DEBUG nova.network.neutron [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updating instance_info_cache with network_info: [{"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:10 compute-0 nova_compute[181978]: 2026-01-12 13:51:10.756 181991 DEBUG oslo_concurrency.lockutils [req-56d6fd2f-8479-4caf-9b25-e6f0a952ecd5 req-723ed219-9f67-4996-9b26-68c108e51e38 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:51:10 compute-0 nova_compute[181978]: 2026-01-12 13:51:10.991 181991 INFO nova.compute.manager [None req-4393033a-956b-4e2a-8527-264b3fbfd75a d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Get console output
Jan 12 13:51:10 compute-0 nova_compute[181978]: 2026-01-12 13:51:10.995 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:51:11 compute-0 podman[213817]: 2026-01-12 13:51:11.555437748 +0000 UTC m=+0.044327967 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.034 181991 DEBUG nova.compute.manager [req-1037100a-3d1f-42bb-92ff-31bd5bb1d1df req-f7ca7074-d2f8-4663-a93b-ca4464f09b8c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.035 181991 DEBUG oslo_concurrency.lockutils [req-1037100a-3d1f-42bb-92ff-31bd5bb1d1df req-f7ca7074-d2f8-4663-a93b-ca4464f09b8c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.035 181991 DEBUG oslo_concurrency.lockutils [req-1037100a-3d1f-42bb-92ff-31bd5bb1d1df req-f7ca7074-d2f8-4663-a93b-ca4464f09b8c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.035 181991 DEBUG oslo_concurrency.lockutils [req-1037100a-3d1f-42bb-92ff-31bd5bb1d1df req-f7ca7074-d2f8-4663-a93b-ca4464f09b8c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.036 181991 DEBUG nova.compute.manager [req-1037100a-3d1f-42bb-92ff-31bd5bb1d1df req-f7ca7074-d2f8-4663-a93b-ca4464f09b8c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] No waiting events found dispatching network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.036 181991 WARNING nova.compute.manager [req-1037100a-3d1f-42bb-92ff-31bd5bb1d1df req-f7ca7074-d2f8-4663-a93b-ca4464f09b8c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received unexpected event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff for instance with vm_state active and task_state None.
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.269 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.702 181991 INFO nova.compute.manager [None req-ff75e292-b1de-4c6e-9907-85df9173fcf9 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Get console output
Jan 12 13:51:12 compute-0 nova_compute[181978]: 2026-01-12 13:51:12.706 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.404 181991 DEBUG nova.compute.manager [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-changed-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.404 181991 DEBUG nova.compute.manager [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Refreshing instance network info cache due to event network-changed-028fd111-a615-4adf-a755-cf7fe0f5b0d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.404 181991 DEBUG oslo_concurrency.lockutils [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.404 181991 DEBUG oslo_concurrency.lockutils [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.405 181991 DEBUG nova.network.neutron [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Refreshing network info cache for port 028fd111-a615-4adf-a755-cf7fe0f5b0d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.468 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.468 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.468 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.469 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.469 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.470 181991 INFO nova.compute.manager [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Terminating instance
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.470 181991 DEBUG nova.compute.manager [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:51:13 compute-0 kernel: tap028fd111-a6 (unregistering): left promiscuous mode
Jan 12 13:51:13 compute-0 NetworkManager[55211]: <info>  [1768225873.4979] device (tap028fd111-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.505 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 ovn_controller[94974]: 2026-01-12T13:51:13Z|00159|binding|INFO|Releasing lport 028fd111-a615-4adf-a755-cf7fe0f5b0d6 from this chassis (sb_readonly=0)
Jan 12 13:51:13 compute-0 ovn_controller[94974]: 2026-01-12T13:51:13Z|00160|binding|INFO|Setting lport 028fd111-a615-4adf-a755-cf7fe0f5b0d6 down in Southbound
Jan 12 13:51:13 compute-0 ovn_controller[94974]: 2026-01-12T13:51:13Z|00161|binding|INFO|Removing iface tap028fd111-a6 ovn-installed in OVS
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.507 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.509 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:f1:d5 10.100.0.9'], port_security=['fa:16:3e:db:f1:d5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '13b83140-3702-4853-ba8b-cdde31476ed3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a028035-7f0e-4424-9a98-a3f388afa919, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=028fd111-a615-4adf-a755-cf7fe0f5b0d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.510 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 028fd111-a615-4adf-a755-cf7fe0f5b0d6 in datapath dcfaf59f-c145-4ceb-8579-9f58575d161f unbound from our chassis
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.511 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcfaf59f-c145-4ceb-8579-9f58575d161f
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.519 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.525 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[f796b25e-0bef-4710-90f7-e38fcd66b1aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:13 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 12 13:51:13 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 11.767s CPU time.
Jan 12 13:51:13 compute-0 systemd-machined[153581]: Machine qemu-12-instance-0000000c terminated.
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.548 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[122a76f0-f2ed-4ef1-ba9b-db5b9d0f12b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.550 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[18aa3900-3e96-4847-b1c7-aab9478edf2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.570 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[1bede373-4720-45a0-aa9c-44ad2e90eeb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.581 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c48f97ea-b879-4377-a2e2-10f2bca294a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcfaf59f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290499, 'reachable_time': 25042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213844, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.593 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[92ddd5ec-7464-468d-84d0-423e01361290]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdcfaf59f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 290506, 'tstamp': 290506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213845, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdcfaf59f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 290508, 'tstamp': 290508}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213845, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.594 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcfaf59f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.595 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.598 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.599 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcfaf59f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.599 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.599 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcfaf59f-c0, col_values=(('external_ids', {'iface-id': '614d27be-df64-4b64-b0c6-45ad7c79ede1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:13 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:13.599 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.683 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.687 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.707 181991 INFO nova.virt.libvirt.driver [-] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Instance destroyed successfully.
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.707 181991 DEBUG nova.objects.instance [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.720 181991 DEBUG nova.virt.libvirt.vif [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:50:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1065387734',display_name='tempest-TestNetworkBasicOps-server-1065387734',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1065387734',id=12,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBObNdm8G3Bb3woErIsQi8zn1PlnkwijV+u0/zGRytOK7GZmUmhXDiKgHezfs87bk9oMJ+2hnBkZK0mYYN+55I9pE+wWpOHP6C/l1lwen1+dU8yrkChT77LnAkMSueXD8Ew==',key_name='tempest-TestNetworkBasicOps-1687337889',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:50:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-xhwravzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:50:51Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.721 181991 DEBUG nova.network.os_vif_util [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.721 181991 DEBUG nova.network.os_vif_util [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:f1:d5,bridge_name='br-int',has_traffic_filtering=True,id=028fd111-a615-4adf-a755-cf7fe0f5b0d6,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap028fd111-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.722 181991 DEBUG os_vif [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:f1:d5,bridge_name='br-int',has_traffic_filtering=True,id=028fd111-a615-4adf-a755-cf7fe0f5b0d6,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap028fd111-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.723 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.723 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap028fd111-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.724 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.726 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.727 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.728 181991 INFO os_vif [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:f1:d5,bridge_name='br-int',has_traffic_filtering=True,id=028fd111-a615-4adf-a755-cf7fe0f5b0d6,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap028fd111-a6')
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.729 181991 INFO nova.virt.libvirt.driver [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Deleting instance files /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5_del
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.729 181991 INFO nova.virt.libvirt.driver [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Deletion of /var/lib/nova/instances/b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5_del complete
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.775 181991 INFO nova.compute.manager [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Took 0.30 seconds to destroy the instance on the hypervisor.
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.775 181991 DEBUG oslo.service.loopingcall [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.775 181991 DEBUG nova.compute.manager [-] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:51:13 compute-0 nova_compute[181978]: 2026-01-12 13:51:13.776 181991 DEBUG nova.network.neutron [-] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.103 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.104 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing instance network info cache due to event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.104 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.104 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.104 181991 DEBUG nova.network.neutron [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.116 181991 DEBUG nova.network.neutron [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updated VIF entry in instance network info cache for port 028fd111-a615-4adf-a755-cf7fe0f5b0d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.117 181991 DEBUG nova.network.neutron [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updating instance_info_cache with network_info: [{"id": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "address": "fa:16:3e:db:f1:d5", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap028fd111-a6", "ovs_interfaceid": "028fd111-a615-4adf-a755-cf7fe0f5b0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.128 181991 DEBUG oslo_concurrency.lockutils [req-276190cc-0920-43a3-9406-55838c2c7ab6 req-ea8411ef-9232-481e-9811-cef24fb487d9 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.298 181991 DEBUG nova.network.neutron [-] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.310 181991 INFO nova.compute.manager [-] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Took 0.53 seconds to deallocate network for instance.
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.339 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.339 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.387 181991 DEBUG nova.compute.provider_tree [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.397 181991 DEBUG nova.scheduler.client.report [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.411 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.429 181991 INFO nova.scheduler.client.report [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.480 181991 DEBUG oslo_concurrency.lockutils [None req-810c4466-96e1-4cbf-92b9-e1af9450cd86 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.870 181991 DEBUG nova.network.neutron [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updated VIF entry in instance network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.871 181991 DEBUG nova.network.neutron [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updating instance_info_cache with network_info: [{"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.884 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.884 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.884 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.884 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.884 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.885 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] No waiting events found dispatching network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.885 181991 WARNING nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received unexpected event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff for instance with vm_state active and task_state None.
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.885 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.885 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.885 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.886 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.886 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] No waiting events found dispatching network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.886 181991 WARNING nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received unexpected event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff for instance with vm_state active and task_state None.
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.886 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-vif-unplugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.886 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.886 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.887 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.887 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] No waiting events found dispatching network-vif-unplugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.887 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-vif-unplugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.887 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.887 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.887 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.888 181991 DEBUG oslo_concurrency.lockutils [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.888 181991 DEBUG nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] No waiting events found dispatching network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:14 compute-0 nova_compute[181978]: 2026-01-12 13:51:14.888 181991 WARNING nova.compute.manager [req-0bb4fa32-44c0-4271-9657-f2389df587f1 req-717901d8-fb77-4f1e-8989-6926fc9b8240 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received unexpected event network-vif-plugged-028fd111-a615-4adf-a755-cf7fe0f5b0d6 for instance with vm_state active and task_state deleting.
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.473 181991 DEBUG nova.compute.manager [req-35d36e75-0e22-4590-b517-802d3be80eec req-2bafe1ba-decb-454f-bc9c-05db1cb0aa2b 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Received event network-vif-deleted-028fd111-a615-4adf-a755-cf7fe0f5b0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.635 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.635 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.636 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.636 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.636 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.637 181991 INFO nova.compute.manager [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Terminating instance
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.638 181991 DEBUG nova.compute.manager [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:51:15 compute-0 kernel: tap8a8799e1-01 (unregistering): left promiscuous mode
Jan 12 13:51:15 compute-0 NetworkManager[55211]: <info>  [1768225875.6610] device (tap8a8799e1-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.665 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:15 compute-0 ovn_controller[94974]: 2026-01-12T13:51:15Z|00162|binding|INFO|Releasing lport 8a8799e1-01b3-47f7-85ad-0355618411ff from this chassis (sb_readonly=0)
Jan 12 13:51:15 compute-0 ovn_controller[94974]: 2026-01-12T13:51:15Z|00163|binding|INFO|Setting lport 8a8799e1-01b3-47f7-85ad-0355618411ff down in Southbound
Jan 12 13:51:15 compute-0 ovn_controller[94974]: 2026-01-12T13:51:15Z|00164|binding|INFO|Removing iface tap8a8799e1-01 ovn-installed in OVS
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.668 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.680 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:3f:05 10.100.0.4'], port_security=['fa:16:3e:f7:3f:05 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '34997797-aac3-4d72-92c4-a51ee249bd90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'dcf23cac-c128-431a-843f-97e2cc14c1fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a028035-7f0e-4424-9a98-a3f388afa919, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=8a8799e1-01b3-47f7-85ad-0355618411ff) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.681 104189 INFO neutron.agent.ovn.metadata.agent [-] Port 8a8799e1-01b3-47f7-85ad-0355618411ff in datapath dcfaf59f-c145-4ceb-8579-9f58575d161f unbound from our chassis
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.682 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dcfaf59f-c145-4ceb-8579-9f58575d161f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.683 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.683 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[32e0fb47-b6ef-4ba5-b9d8-7468e730b4cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.683 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f namespace which is not needed anymore
Jan 12 13:51:15 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 12 13:51:15 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 11.362s CPU time.
Jan 12 13:51:15 compute-0 systemd-machined[153581]: Machine qemu-11-instance-0000000b terminated.
Jan 12 13:51:15 compute-0 neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f[213616]: [NOTICE]   (213620) : haproxy version is 2.8.14-c23fe91
Jan 12 13:51:15 compute-0 neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f[213616]: [NOTICE]   (213620) : path to executable is /usr/sbin/haproxy
Jan 12 13:51:15 compute-0 neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f[213616]: [ALERT]    (213620) : Current worker (213622) exited with code 143 (Terminated)
Jan 12 13:51:15 compute-0 neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f[213616]: [WARNING]  (213620) : All workers exited. Exiting... (0)
Jan 12 13:51:15 compute-0 systemd[1]: libpod-02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d.scope: Deactivated successfully.
Jan 12 13:51:15 compute-0 podman[213882]: 2026-01-12 13:51:15.77953888 +0000 UTC m=+0.032876256 container died 02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 12 13:51:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d-userdata-shm.mount: Deactivated successfully.
Jan 12 13:51:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-b890c2b94c9c18807c4d6c4286160e1f219ed65f166d6803cdceeb894f787daf-merged.mount: Deactivated successfully.
Jan 12 13:51:15 compute-0 podman[213882]: 2026-01-12 13:51:15.79961249 +0000 UTC m=+0.052949867 container cleanup 02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 12 13:51:15 compute-0 systemd[1]: libpod-conmon-02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d.scope: Deactivated successfully.
Jan 12 13:51:15 compute-0 podman[213905]: 2026-01-12 13:51:15.840299859 +0000 UTC m=+0.024137787 container remove 02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.843 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[16292216-78b3-4790-829a-b84082c69d9e]: (4, ('Mon Jan 12 01:51:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f (02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d)\n02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d\nMon Jan 12 01:51:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f (02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d)\n02d397dc8a1c6a921cf04f73b1292ba3cac479a6c3c9c96ed77fdb6d885b253d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.844 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[0c10355c-bc80-4155-90c3-3b121c2ec8b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.845 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcfaf59f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:15 compute-0 kernel: tapdcfaf59f-c0: left promiscuous mode
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.847 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:15 compute-0 NetworkManager[55211]: <info>  [1768225875.8514] manager: (tap8a8799e1-01): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.865 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.867 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3962086b-1ece-487e-93be-b5439a8ab93b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.875 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[0428153e-5e9f-4f2a-8cd1-86c77631a55d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.875 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d800cf-e625-4653-b4c2-ab0c076e42ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.885 181991 INFO nova.virt.libvirt.driver [-] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Instance destroyed successfully.
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.886 181991 DEBUG nova.objects.instance [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid 34997797-aac3-4d72-92c4-a51ee249bd90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.886 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[01bcad22-c27a-4d4a-8789-6f65df6c2bb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 290490, 'reachable_time': 34128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213934, 'error': None, 'target': 'ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 systemd[1]: run-netns-ovnmeta\x2ddcfaf59f\x2dc145\x2d4ceb\x2d8579\x2d9f58575d161f.mount: Deactivated successfully.
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.890 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dcfaf59f-c145-4ceb-8579-9f58575d161f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:51:15 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:15.890 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[37a89999-339c-489f-a3f3-80e59f6a29bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.896 181991 DEBUG nova.virt.libvirt.vif [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2101764621',display_name='tempest-TestNetworkBasicOps-server-2101764621',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2101764621',id=11,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTXY+5Js3/qGeOTbyU7Fp6Taau6MqKO6ctOdJh5uqBdeMuch+L0KSDhlREpRkZyft2pzNm0v0itN3ZI81fO8b75kt4yFKgyIMPxameJ8QaycUc+5JlXnAwQIngaPGZ2bQ==',key_name='tempest-TestNetworkBasicOps-23131883',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:50:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-ypsnzhb4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:50:39Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=34997797-aac3-4d72-92c4-a51ee249bd90,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.896 181991 DEBUG nova.network.os_vif_util [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "8a8799e1-01b3-47f7-85ad-0355618411ff", "address": "fa:16:3e:f7:3f:05", "network": {"id": "dcfaf59f-c145-4ceb-8579-9f58575d161f", "bridge": "br-int", "label": "tempest-network-smoke--962311184", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a8799e1-01", "ovs_interfaceid": "8a8799e1-01b3-47f7-85ad-0355618411ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.897 181991 DEBUG nova.network.os_vif_util [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:3f:05,bridge_name='br-int',has_traffic_filtering=True,id=8a8799e1-01b3-47f7-85ad-0355618411ff,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8799e1-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.897 181991 DEBUG os_vif [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:3f:05,bridge_name='br-int',has_traffic_filtering=True,id=8a8799e1-01b3-47f7-85ad-0355618411ff,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8799e1-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.898 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.898 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a8799e1-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.899 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.902 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.903 181991 INFO os_vif [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:3f:05,bridge_name='br-int',has_traffic_filtering=True,id=8a8799e1-01b3-47f7-85ad-0355618411ff,network=Network(dcfaf59f-c145-4ceb-8579-9f58575d161f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a8799e1-01')
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.904 181991 INFO nova.virt.libvirt.driver [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Deleting instance files /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90_del
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.904 181991 INFO nova.virt.libvirt.driver [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Deletion of /var/lib/nova/instances/34997797-aac3-4d72-92c4-a51ee249bd90_del complete
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.944 181991 INFO nova.compute.manager [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Took 0.31 seconds to destroy the instance on the hypervisor.
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.944 181991 DEBUG oslo.service.loopingcall [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.944 181991 DEBUG nova.compute.manager [-] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:51:15 compute-0 nova_compute[181978]: 2026-01-12 13:51:15.944 181991 DEBUG nova.network.neutron [-] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.293 181991 DEBUG nova.compute.manager [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.294 181991 DEBUG nova.compute.manager [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing instance network info cache due to event network-changed-8a8799e1-01b3-47f7-85ad-0355618411ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.294 181991 DEBUG oslo_concurrency.lockutils [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.294 181991 DEBUG oslo_concurrency.lockutils [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.294 181991 DEBUG nova.network.neutron [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Refreshing network info cache for port 8a8799e1-01b3-47f7-85ad-0355618411ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.300 181991 DEBUG nova.network.neutron [-] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.310 181991 INFO nova.compute.manager [-] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Took 0.37 seconds to deallocate network for instance.
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.341 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.341 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.376 181991 DEBUG nova.compute.provider_tree [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.388 181991 DEBUG nova.scheduler.client.report [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.405 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.412 181991 INFO nova.network.neutron [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Port 8a8799e1-01b3-47f7-85ad-0355618411ff from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.412 181991 DEBUG nova.network.neutron [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.421 181991 DEBUG oslo_concurrency.lockutils [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-34997797-aac3-4d72-92c4-a51ee249bd90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.421 181991 DEBUG nova.compute.manager [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-unplugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.421 181991 DEBUG oslo_concurrency.lockutils [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.422 181991 DEBUG oslo_concurrency.lockutils [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.422 181991 DEBUG oslo_concurrency.lockutils [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.422 181991 DEBUG nova.compute.manager [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] No waiting events found dispatching network-vif-unplugged-8a8799e1-01b3-47f7-85ad-0355618411ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.422 181991 DEBUG nova.compute.manager [req-e36cd99d-568b-4cbe-9ded-1d9d76fdca9c req-fe18edd8-f359-439b-b302-46a9d33135e1 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-unplugged-8a8799e1-01b3-47f7-85ad-0355618411ff for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.424 181991 INFO nova.scheduler.client.report [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance 34997797-aac3-4d72-92c4-a51ee249bd90
Jan 12 13:51:16 compute-0 nova_compute[181978]: 2026-01-12 13:51:16.463 181991 DEBUG oslo_concurrency.lockutils [None req-f8aea9e1-6e35-4336-a64d-7cd1b7d5aa40 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:17 compute-0 nova_compute[181978]: 2026-01-12 13:51:17.270 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:17 compute-0 nova_compute[181978]: 2026-01-12 13:51:17.558 181991 DEBUG nova.compute.manager [req-0a60115a-35da-468c-95d9-e617b38ead45 req-7d3ab731-5aaa-44a8-8464-863b541c49f8 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-deleted-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:18 compute-0 nova_compute[181978]: 2026-01-12 13:51:18.354 181991 DEBUG nova.compute.manager [req-c55b84ea-0f1e-40cb-b238-38a255657b80 req-f2a99745-d6b2-4870-935e-c39fa4502d58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:18 compute-0 nova_compute[181978]: 2026-01-12 13:51:18.354 181991 DEBUG oslo_concurrency.lockutils [req-c55b84ea-0f1e-40cb-b238-38a255657b80 req-f2a99745-d6b2-4870-935e-c39fa4502d58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:18 compute-0 nova_compute[181978]: 2026-01-12 13:51:18.355 181991 DEBUG oslo_concurrency.lockutils [req-c55b84ea-0f1e-40cb-b238-38a255657b80 req-f2a99745-d6b2-4870-935e-c39fa4502d58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:18 compute-0 nova_compute[181978]: 2026-01-12 13:51:18.355 181991 DEBUG oslo_concurrency.lockutils [req-c55b84ea-0f1e-40cb-b238-38a255657b80 req-f2a99745-d6b2-4870-935e-c39fa4502d58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "34997797-aac3-4d72-92c4-a51ee249bd90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:18 compute-0 nova_compute[181978]: 2026-01-12 13:51:18.355 181991 DEBUG nova.compute.manager [req-c55b84ea-0f1e-40cb-b238-38a255657b80 req-f2a99745-d6b2-4870-935e-c39fa4502d58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] No waiting events found dispatching network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:18 compute-0 nova_compute[181978]: 2026-01-12 13:51:18.356 181991 WARNING nova.compute.manager [req-c55b84ea-0f1e-40cb-b238-38a255657b80 req-f2a99745-d6b2-4870-935e-c39fa4502d58 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Received unexpected event network-vif-plugged-8a8799e1-01b3-47f7-85ad-0355618411ff for instance with vm_state deleted and task_state None.
Jan 12 13:51:18 compute-0 nova_compute[181978]: 2026-01-12 13:51:18.928 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:18.928 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:51:18 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:18.929 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:51:19 compute-0 nova_compute[181978]: 2026-01-12 13:51:19.496 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:19 compute-0 nova_compute[181978]: 2026-01-12 13:51:19.497 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:51:19 compute-0 nova_compute[181978]: 2026-01-12 13:51:19.497 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:51:19 compute-0 nova_compute[181978]: 2026-01-12 13:51:19.509 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:51:20 compute-0 nova_compute[181978]: 2026-01-12 13:51:20.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:20 compute-0 podman[213940]: 2026-01-12 13:51:20.561466116 +0000 UTC m=+0.051702360 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 12 13:51:20 compute-0 podman[213939]: 2026-01-12 13:51:20.57939358 +0000 UTC m=+0.071195080 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 12 13:51:20 compute-0 podman[213938]: 2026-01-12 13:51:20.57944669 +0000 UTC m=+0.071242248 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 12 13:51:20 compute-0 nova_compute[181978]: 2026-01-12 13:51:20.894 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:20 compute-0 nova_compute[181978]: 2026-01-12 13:51:20.899 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:20 compute-0 nova_compute[181978]: 2026-01-12 13:51:20.969 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.272 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.499 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.499 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.499 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.500 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.689 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.690 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5774MB free_disk=73.38037490844727GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.691 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.691 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.856 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.856 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.871 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.885 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.899 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:51:22 compute-0 nova_compute[181978]: 2026-01-12 13:51:22.899 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:23 compute-0 nova_compute[181978]: 2026-01-12 13:51:23.899 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:23 compute-0 nova_compute[181978]: 2026-01-12 13:51:23.899 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:23 compute-0 nova_compute[181978]: 2026-01-12 13:51:23.900 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:24 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:24.930 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:25 compute-0 nova_compute[181978]: 2026-01-12 13:51:25.900 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:26 compute-0 nova_compute[181978]: 2026-01-12 13:51:26.477 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:26 compute-0 nova_compute[181978]: 2026-01-12 13:51:26.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:51:26 compute-0 nova_compute[181978]: 2026-01-12 13:51:26.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:51:27 compute-0 nova_compute[181978]: 2026-01-12 13:51:27.273 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:28 compute-0 podman[214003]: 2026-01-12 13:51:28.545336534 +0000 UTC m=+0.034749480 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:51:28 compute-0 nova_compute[181978]: 2026-01-12 13:51:28.706 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225873.7052524, b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:51:28 compute-0 nova_compute[181978]: 2026-01-12 13:51:28.706 181991 INFO nova.compute.manager [-] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] VM Stopped (Lifecycle Event)
Jan 12 13:51:28 compute-0 nova_compute[181978]: 2026-01-12 13:51:28.724 181991 DEBUG nova.compute.manager [None req-97262b98-d3d0-479a-ab95-6fae581c230d - - - - - -] [instance: b3cad11d-f82b-4a5d-b4ff-3c80b7bbdcf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:51:30 compute-0 nova_compute[181978]: 2026-01-12 13:51:30.885 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225875.8845193, 34997797-aac3-4d72-92c4-a51ee249bd90 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:51:30 compute-0 nova_compute[181978]: 2026-01-12 13:51:30.885 181991 INFO nova.compute.manager [-] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] VM Stopped (Lifecycle Event)
Jan 12 13:51:30 compute-0 nova_compute[181978]: 2026-01-12 13:51:30.901 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:30 compute-0 nova_compute[181978]: 2026-01-12 13:51:30.903 181991 DEBUG nova.compute.manager [None req-211f7205-e120-4380-bc96-9d98fc81431b - - - - - -] [instance: 34997797-aac3-4d72-92c4-a51ee249bd90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:51:32 compute-0 nova_compute[181978]: 2026-01-12 13:51:32.274 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.128 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.128 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.140 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.197 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.198 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.203 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.203 181991 INFO nova.compute.claims [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Claim successful on node compute-0.ctlplane.example.com
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.269 181991 DEBUG nova.compute.provider_tree [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.276 181991 DEBUG nova.scheduler.client.report [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.287 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.288 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.315 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.315 181991 DEBUG nova.network.neutron [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.332 181991 INFO nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.344 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.412 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.413 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.414 181991 INFO nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Creating image(s)
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.414 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "/var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.414 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.415 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "/var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.425 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.470 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.471 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.471 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.481 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.524 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.525 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.546 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07,backing_fmt=raw /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.547 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.547 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.590 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/143aeda846ec0b9f7fc95749c1b9ce3f1f1aff07 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.591 181991 DEBUG nova.virt.disk.api [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Checking if we can resize image /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.591 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.637 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.638 181991 DEBUG nova.virt.disk.api [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Cannot resize image /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.638 181991 DEBUG nova.objects.instance [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'migration_context' on Instance uuid b449d9a3-fefa-4191-b13a-8cf17c4292c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.653 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.653 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Ensure instance console log exists: /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.653 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.653 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:33 compute-0 nova_compute[181978]: 2026-01-12 13:51:33.654 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:34 compute-0 nova_compute[181978]: 2026-01-12 13:51:34.318 181991 DEBUG nova.policy [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4158a3958504a578730a6b3561138ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c978298f864c4039b47e09202eaf780c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 12 13:51:35 compute-0 nova_compute[181978]: 2026-01-12 13:51:35.903 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:37 compute-0 nova_compute[181978]: 2026-01-12 13:51:37.275 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:37 compute-0 nova_compute[181978]: 2026-01-12 13:51:37.410 181991 DEBUG nova.network.neutron [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Successfully created port: ef32f42e-782a-49a7-b298-f9694b804468 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.770 181991 DEBUG nova.network.neutron [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Successfully updated port: ef32f42e-782a-49a7-b298-f9694b804468 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.783 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.784 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquired lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.784 181991 DEBUG nova.network.neutron [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.834 181991 DEBUG nova.compute.manager [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-changed-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.835 181991 DEBUG nova.compute.manager [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Refreshing instance network info cache due to event network-changed-ef32f42e-782a-49a7-b298-f9694b804468. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.835 181991 DEBUG oslo_concurrency.lockutils [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:51:38 compute-0 nova_compute[181978]: 2026-01-12 13:51:38.874 181991 DEBUG nova.network.neutron [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.378 181991 DEBUG nova.network.neutron [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updating instance_info_cache with network_info: [{"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.406 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Releasing lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.406 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Instance network_info: |[{"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.406 181991 DEBUG oslo_concurrency.lockutils [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.406 181991 DEBUG nova.network.neutron [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Refreshing network info cache for port ef32f42e-782a-49a7-b298-f9694b804468 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.408 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Start _get_guest_xml network_info=[{"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'boot_index': 0, 'encryption_options': None, 'image_id': 'bcf708d4-c9eb-4a4c-9503-f846d9f4a560'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.411 181991 WARNING nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.415 181991 DEBUG nova.virt.libvirt.host [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.415 181991 DEBUG nova.virt.libvirt.host [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.421 181991 DEBUG nova.virt.libvirt.host [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.421 181991 DEBUG nova.virt.libvirt.host [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.421 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.421 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-12T13:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0bbd7717-2f21-486b-811b-14d24452f9a6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-12T13:43:36Z,direct_url=<?>,disk_format='qcow2',id=bcf708d4-c9eb-4a4c-9503-f846d9f4a560,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='07608c920d5845209073c2b943d2a58b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-12T13:43:38Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.422 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.422 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.422 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.422 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.423 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.423 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.423 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.423 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.423 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.423 181991 DEBUG nova.virt.hardware [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.426 181991 DEBUG nova.virt.libvirt.vif [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:51:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1179119248',display_name='tempest-TestNetworkBasicOps-server-1179119248',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1179119248',id=13,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDuDakJSlC28NqdQfClvGdWH2uIHquw7GGQzNxmLDpJgSfvu9Kv1ha2IJC4FIKQq24Fm4Mcp8+8ichv49iWIDW3bWILECrmVam9ciCfRMc/hiW/2heaG67GoRrFUKSkGLA==',key_name='tempest-TestNetworkBasicOps-128564872',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-922t7zit',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:51:33Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=b449d9a3-fefa-4191-b13a-8cf17c4292c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.426 181991 DEBUG nova.network.os_vif_util [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.427 181991 DEBUG nova.network.os_vif_util [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:aa:d8,bridge_name='br-int',has_traffic_filtering=True,id=ef32f42e-782a-49a7-b298-f9694b804468,network=Network(32cf8d0a-4048-4450-b516-f3e1b18206b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef32f42e-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.428 181991 DEBUG nova.objects.instance [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'pci_devices' on Instance uuid b449d9a3-fefa-4191-b13a-8cf17c4292c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.435 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] End _get_guest_xml xml=<domain type="kvm">
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <uuid>b449d9a3-fefa-4191-b13a-8cf17c4292c5</uuid>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <name>instance-0000000d</name>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <memory>131072</memory>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <vcpu>1</vcpu>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <metadata>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <nova:name>tempest-TestNetworkBasicOps-server-1179119248</nova:name>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <nova:creationTime>2026-01-12 13:51:39</nova:creationTime>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <nova:flavor name="m1.nano">
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:memory>128</nova:memory>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:disk>1</nova:disk>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:swap>0</nova:swap>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:ephemeral>0</nova:ephemeral>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:vcpus>1</nova:vcpus>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       </nova:flavor>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <nova:owner>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:user uuid="d4158a3958504a578730a6b3561138ce">tempest-TestNetworkBasicOps-868482653-project-member</nova:user>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:project uuid="c978298f864c4039b47e09202eaf780c">tempest-TestNetworkBasicOps-868482653</nova:project>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       </nova:owner>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <nova:root type="image" uuid="bcf708d4-c9eb-4a4c-9503-f846d9f4a560"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <nova:ports>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         <nova:port uuid="ef32f42e-782a-49a7-b298-f9694b804468">
Jan 12 13:51:39 compute-0 nova_compute[181978]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:         </nova:port>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       </nova:ports>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </nova:instance>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   </metadata>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <sysinfo type="smbios">
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <system>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <entry name="manufacturer">RDO</entry>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <entry name="product">OpenStack Compute</entry>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <entry name="serial">b449d9a3-fefa-4191-b13a-8cf17c4292c5</entry>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <entry name="uuid">b449d9a3-fefa-4191-b13a-8cf17c4292c5</entry>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <entry name="family">Virtual Machine</entry>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </system>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   </sysinfo>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <os>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <boot dev="hd"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <smbios mode="sysinfo"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   </os>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <features>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <acpi/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <apic/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <vmcoreinfo/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   </features>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <clock offset="utc">
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <timer name="pit" tickpolicy="delay"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <timer name="hpet" present="no"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   </clock>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <cpu mode="host-model" match="exact">
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <topology sockets="1" cores="1" threads="1"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   </cpu>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   <devices>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <disk type="file" device="disk">
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <target dev="vda" bus="virtio"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <disk type="file" device="cdrom">
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <driver name="qemu" type="raw" cache="none"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <source file="/var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk.config"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <target dev="sda" bus="sata"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </disk>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <interface type="ethernet">
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <mac address="fa:16:3e:a9:aa:d8"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <driver name="vhost" rx_queue_size="512"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <mtu size="1442"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <target dev="tapef32f42e-78"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </interface>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <serial type="pty">
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <log file="/var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/console.log" append="off"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </serial>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <video>
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <model type="virtio"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </video>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <input type="tablet" bus="usb"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <rng model="virtio">
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <backend model="random">/dev/urandom</backend>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </rng>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="pci" model="pcie-root-port"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <controller type="usb" index="0"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     <memballoon model="virtio">
Jan 12 13:51:39 compute-0 nova_compute[181978]:       <stats period="10"/>
Jan 12 13:51:39 compute-0 nova_compute[181978]:     </memballoon>
Jan 12 13:51:39 compute-0 nova_compute[181978]:   </devices>
Jan 12 13:51:39 compute-0 nova_compute[181978]: </domain>
Jan 12 13:51:39 compute-0 nova_compute[181978]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.436 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Preparing to wait for external event network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.436 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.436 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.436 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.437 181991 DEBUG nova.virt.libvirt.vif [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-12T13:51:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1179119248',display_name='tempest-TestNetworkBasicOps-server-1179119248',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1179119248',id=13,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDuDakJSlC28NqdQfClvGdWH2uIHquw7GGQzNxmLDpJgSfvu9Kv1ha2IJC4FIKQq24Fm4Mcp8+8ichv49iWIDW3bWILECrmVam9ciCfRMc/hiW/2heaG67GoRrFUKSkGLA==',key_name='tempest-TestNetworkBasicOps-128564872',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-922t7zit',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-12T13:51:33Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=b449d9a3-fefa-4191-b13a-8cf17c4292c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.437 181991 DEBUG nova.network.os_vif_util [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.437 181991 DEBUG nova.network.os_vif_util [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:aa:d8,bridge_name='br-int',has_traffic_filtering=True,id=ef32f42e-782a-49a7-b298-f9694b804468,network=Network(32cf8d0a-4048-4450-b516-f3e1b18206b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef32f42e-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.438 181991 DEBUG os_vif [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:aa:d8,bridge_name='br-int',has_traffic_filtering=True,id=ef32f42e-782a-49a7-b298-f9694b804468,network=Network(32cf8d0a-4048-4450-b516-f3e1b18206b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef32f42e-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.438 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.438 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.439 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.440 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.441 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef32f42e-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.441 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef32f42e-78, col_values=(('external_ids', {'iface-id': 'ef32f42e-782a-49a7-b298-f9694b804468', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:aa:d8', 'vm-uuid': 'b449d9a3-fefa-4191-b13a-8cf17c4292c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:39 compute-0 NetworkManager[55211]: <info>  [1768225899.4433] manager: (tapef32f42e-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.444 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.446 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.446 181991 INFO os_vif [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:aa:d8,bridge_name='br-int',has_traffic_filtering=True,id=ef32f42e-782a-49a7-b298-f9694b804468,network=Network(32cf8d0a-4048-4450-b516-f3e1b18206b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef32f42e-78')
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.475 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.475 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.475 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] No VIF found with MAC fa:16:3e:a9:aa:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 12 13:51:39 compute-0 nova_compute[181978]: 2026-01-12 13:51:39.475 181991 INFO nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Using config drive
Jan 12 13:51:39 compute-0 podman[214037]: 2026-01-12 13:51:39.545500692 +0000 UTC m=+0.038772086 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.205 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.205 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.205 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.495 181991 INFO nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Creating config drive at /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk.config
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.499 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo5qzvgkj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.617 181991 DEBUG oslo_concurrency.processutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo5qzvgkj" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:51:40 compute-0 kernel: tapef32f42e-78: entered promiscuous mode
Jan 12 13:51:40 compute-0 NetworkManager[55211]: <info>  [1768225900.6559] manager: (tapef32f42e-78): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Jan 12 13:51:40 compute-0 ovn_controller[94974]: 2026-01-12T13:51:40Z|00165|binding|INFO|Claiming lport ef32f42e-782a-49a7-b298-f9694b804468 for this chassis.
Jan 12 13:51:40 compute-0 ovn_controller[94974]: 2026-01-12T13:51:40Z|00166|binding|INFO|ef32f42e-782a-49a7-b298-f9694b804468: Claiming fa:16:3e:a9:aa:d8 10.100.0.8
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.658 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.675 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:aa:d8 10.100.0.8'], port_security=['fa:16:3e:a9:aa:d8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b449d9a3-fefa-4191-b13a-8cf17c4292c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '384e5c02-aa7e-4fb8-bedb-232989afa169', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=818fb4cb-d61b-4330-a5a9-5522d2572cd7, chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=ef32f42e-782a-49a7-b298-f9694b804468) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.675 104189 INFO neutron.agent.ovn.metadata.agent [-] Port ef32f42e-782a-49a7-b298-f9694b804468 in datapath 32cf8d0a-4048-4450-b516-f3e1b18206b2 bound to our chassis
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.676 104189 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 32cf8d0a-4048-4450-b516-f3e1b18206b2
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.684 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7217e6-2ae8-46c0-8e87-fd8739ad7e4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.684 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap32cf8d0a-41 in ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 12 13:51:40 compute-0 systemd-udevd[214078]: Network interface NamePolicy= disabled on kernel command line.
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.687 209930 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap32cf8d0a-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.687 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b2cb097c-27f0-40c0-8515-eb757481e86a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.688 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd70c1f-3394-486d-a497-c0cfcb8419b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 systemd-machined[153581]: New machine qemu-13-instance-0000000d.
Jan 12 13:51:40 compute-0 NetworkManager[55211]: <info>  [1768225900.6973] device (tapef32f42e-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 12 13:51:40 compute-0 NetworkManager[55211]: <info>  [1768225900.6977] device (tapef32f42e-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.701 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[b5dd2012-2fc3-4537-ba62-0dce9928b32c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.717 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:40 compute-0 ovn_controller[94974]: 2026-01-12T13:51:40Z|00167|binding|INFO|Setting lport ef32f42e-782a-49a7-b298-f9694b804468 ovn-installed in OVS
Jan 12 13:51:40 compute-0 ovn_controller[94974]: 2026-01-12T13:51:40Z|00168|binding|INFO|Setting lport ef32f42e-782a-49a7-b298-f9694b804468 up in Southbound
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.721 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.729 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb62035-8d5e-4ad8-bc2b-a71bb8d71eaa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.750 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[022bccbf-c1f6-497f-a0d1-5f3fd9bfc8a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 NetworkManager[55211]: <info>  [1768225900.7553] manager: (tap32cf8d0a-40): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.756 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1b07ee12-7fdc-4bb4-a2e2-c6048ef458d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.778 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[37490b0a-7c83-4fc2-a041-4ec1b0ed32c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.780 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[786a140b-0fe2-4f3f-b75b-8be9c3787f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 NetworkManager[55211]: <info>  [1768225900.7956] device (tap32cf8d0a-40): carrier: link connected
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.800 209970 DEBUG oslo.privsep.daemon [-] privsep: reply[70c09588-d8b5-4099-88ff-ec412f35f35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.812 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a82363-ea59-4ea2-b756-2732167eb9dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap32cf8d0a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:29:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 296682, 'reachable_time': 31572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214102, 'error': None, 'target': 'ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.824 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a27c47-26a7-4947-91a4-a19e2e5c2d06]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:2990'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 296682, 'tstamp': 296682}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214103, 'error': None, 'target': 'ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.836 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4e7917-c0bd-4f89-86bc-596cdbdf3320]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap32cf8d0a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:29:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 296682, 'reachable_time': 31572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214104, 'error': None, 'target': 'ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.858 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4e3ef5-0b8d-4586-9ed1-48820e139746]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.903 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[c0284c18-f9e1-4c45-810b-234c93c48d8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.904 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32cf8d0a-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.904 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.905 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap32cf8d0a-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:40 compute-0 NetworkManager[55211]: <info>  [1768225900.9073] manager: (tap32cf8d0a-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 12 13:51:40 compute-0 kernel: tap32cf8d0a-40: entered promiscuous mode
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.909 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap32cf8d0a-40, col_values=(('external_ids', {'iface-id': 'e6853e4f-6c6d-4b63-941a-c93a069019de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.909 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:40 compute-0 ovn_controller[94974]: 2026-01-12T13:51:40Z|00169|binding|INFO|Releasing lport e6853e4f-6c6d-4b63-941a-c93a069019de from this chassis (sb_readonly=0)
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.911 104189 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/32cf8d0a-4048-4450-b516-f3e1b18206b2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/32cf8d0a-4048-4450-b516-f3e1b18206b2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.912 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e22ed2-4eaa-476d-a710-3b2d6fc7d42b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.913 104189 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: global
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     log         /dev/log local0 debug
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     log-tag     haproxy-metadata-proxy-32cf8d0a-4048-4450-b516-f3e1b18206b2
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     user        root
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     group       root
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     maxconn     1024
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     pidfile     /var/lib/neutron/external/pids/32cf8d0a-4048-4450-b516-f3e1b18206b2.pid.haproxy
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     daemon
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: defaults
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     log global
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     mode http
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     option httplog
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     option dontlognull
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     option http-server-close
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     option forwardfor
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     retries                 3
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     timeout http-request    30s
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     timeout connect         30s
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     timeout client          32s
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     timeout server          32s
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     timeout http-keep-alive 30s
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: listen listener
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     bind 169.254.169.254:80
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     server metadata /var/lib/neutron/metadata_proxy
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:     http-request add-header X-OVN-Network-ID 32cf8d0a-4048-4450-b516-f3e1b18206b2
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 12 13:51:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:51:40.914 104189 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'env', 'PROCESS_TAG=haproxy-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/32cf8d0a-4048-4450-b516-f3e1b18206b2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.924 181991 DEBUG nova.compute.manager [req-d41e0cca-a849-4a5c-8b4d-6770defca8c9 req-96a04d75-8c5f-421a-92b0-8972dbff5660 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.925 181991 DEBUG oslo_concurrency.lockutils [req-d41e0cca-a849-4a5c-8b4d-6770defca8c9 req-96a04d75-8c5f-421a-92b0-8972dbff5660 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.926 181991 DEBUG oslo_concurrency.lockutils [req-d41e0cca-a849-4a5c-8b4d-6770defca8c9 req-96a04d75-8c5f-421a-92b0-8972dbff5660 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.926 181991 DEBUG oslo_concurrency.lockutils [req-d41e0cca-a849-4a5c-8b4d-6770defca8c9 req-96a04d75-8c5f-421a-92b0-8972dbff5660 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.926 181991 DEBUG nova.compute.manager [req-d41e0cca-a849-4a5c-8b4d-6770defca8c9 req-96a04d75-8c5f-421a-92b0-8972dbff5660 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Processing event network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.927 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.955 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.956 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225900.9561226, b449d9a3-fefa-4191-b13a-8cf17c4292c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.956 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] VM Started (Lifecycle Event)
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.959 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.961 181991 INFO nova.virt.libvirt.driver [-] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Instance spawned successfully.
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.961 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.976 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.979 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.979 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.980 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.980 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.980 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.981 181991 DEBUG nova.virt.libvirt.driver [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 12 13:51:40 compute-0 nova_compute[181978]: 2026-01-12 13:51:40.984 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.012 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.017 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225900.956363, b449d9a3-fefa-4191-b13a-8cf17c4292c5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.017 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] VM Paused (Lifecycle Event)
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.033 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.035 181991 DEBUG nova.virt.driver [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] Emitting event <LifecycleEvent: 1768225900.9584432, b449d9a3-fefa-4191-b13a-8cf17c4292c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.036 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] VM Resumed (Lifecycle Event)
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.042 181991 INFO nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Took 7.63 seconds to spawn the instance on the hypervisor.
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.042 181991 DEBUG nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.057 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.059 181991 DEBUG nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.077 181991 INFO nova.compute.manager [None req-0b6a32b1-2b12-4983-9bc7-f638f868e5f3 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.085 181991 INFO nova.compute.manager [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Took 7.91 seconds to build instance.
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.095 181991 DEBUG oslo_concurrency.lockutils [None req-77e90ed8-b5ea-4ece-b60d-b777bfc552f7 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.179 181991 DEBUG nova.network.neutron [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updated VIF entry in instance network info cache for port ef32f42e-782a-49a7-b298-f9694b804468. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.179 181991 DEBUG nova.network.neutron [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updating instance_info_cache with network_info: [{"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:41 compute-0 nova_compute[181978]: 2026-01-12 13:51:41.193 181991 DEBUG oslo_concurrency.lockutils [req-199f78f7-a1a8-45d5-b1fc-ec62b3adff17 req-277bdb67-2c13-4154-9cec-bc24c42747fd 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:51:41 compute-0 podman[214139]: 2026-01-12 13:51:41.224396284 +0000 UTC m=+0.042140563 container create 6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 12 13:51:41 compute-0 systemd[1]: Started libpod-conmon-6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8.scope.
Jan 12 13:51:41 compute-0 systemd[1]: Started libcrun container.
Jan 12 13:51:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbeb97c9b3cfa26cfa5b87b853058cf3922a3be195c4de958b67a9a78d3a7c19/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 12 13:51:41 compute-0 podman[214139]: 2026-01-12 13:51:41.294412594 +0000 UTC m=+0.112156894 container init 6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:51:41 compute-0 podman[214139]: 2026-01-12 13:51:41.29967871 +0000 UTC m=+0.117422990 container start 6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:51:41 compute-0 podman[214139]: 2026-01-12 13:51:41.204478956 +0000 UTC m=+0.022223246 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 12 13:51:41 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [NOTICE]   (214156) : New worker (214158) forked
Jan 12 13:51:41 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [NOTICE]   (214156) : Loading success.
Jan 12 13:51:42 compute-0 nova_compute[181978]: 2026-01-12 13:51:42.277 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:42 compute-0 podman[214163]: 2026-01-12 13:51:42.547224573 +0000 UTC m=+0.041092452 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 12 13:51:42 compute-0 nova_compute[181978]: 2026-01-12 13:51:42.989 181991 DEBUG nova.compute.manager [req-c8f4ef1a-d165-448c-b5d0-a26dd9121fc1 req-70193506-8991-41f7-a14d-c90b0141b492 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:42 compute-0 nova_compute[181978]: 2026-01-12 13:51:42.990 181991 DEBUG oslo_concurrency.lockutils [req-c8f4ef1a-d165-448c-b5d0-a26dd9121fc1 req-70193506-8991-41f7-a14d-c90b0141b492 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:51:42 compute-0 nova_compute[181978]: 2026-01-12 13:51:42.990 181991 DEBUG oslo_concurrency.lockutils [req-c8f4ef1a-d165-448c-b5d0-a26dd9121fc1 req-70193506-8991-41f7-a14d-c90b0141b492 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:51:42 compute-0 nova_compute[181978]: 2026-01-12 13:51:42.990 181991 DEBUG oslo_concurrency.lockutils [req-c8f4ef1a-d165-448c-b5d0-a26dd9121fc1 req-70193506-8991-41f7-a14d-c90b0141b492 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:51:42 compute-0 nova_compute[181978]: 2026-01-12 13:51:42.990 181991 DEBUG nova.compute.manager [req-c8f4ef1a-d165-448c-b5d0-a26dd9121fc1 req-70193506-8991-41f7-a14d-c90b0141b492 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] No waiting events found dispatching network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:51:42 compute-0 nova_compute[181978]: 2026-01-12 13:51:42.991 181991 WARNING nova.compute.manager [req-c8f4ef1a-d165-448c-b5d0-a26dd9121fc1 req-70193506-8991-41f7-a14d-c90b0141b492 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received unexpected event network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 for instance with vm_state active and task_state None.
Jan 12 13:51:44 compute-0 nova_compute[181978]: 2026-01-12 13:51:44.443 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:44 compute-0 ovn_controller[94974]: 2026-01-12T13:51:44Z|00170|binding|INFO|Releasing lport e6853e4f-6c6d-4b63-941a-c93a069019de from this chassis (sb_readonly=0)
Jan 12 13:51:44 compute-0 nova_compute[181978]: 2026-01-12 13:51:44.500 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:44 compute-0 NetworkManager[55211]: <info>  [1768225904.5023] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 12 13:51:44 compute-0 NetworkManager[55211]: <info>  [1768225904.5028] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 12 13:51:44 compute-0 nova_compute[181978]: 2026-01-12 13:51:44.532 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:44 compute-0 ovn_controller[94974]: 2026-01-12T13:51:44Z|00171|binding|INFO|Releasing lport e6853e4f-6c6d-4b63-941a-c93a069019de from this chassis (sb_readonly=0)
Jan 12 13:51:44 compute-0 nova_compute[181978]: 2026-01-12 13:51:44.534 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:45 compute-0 nova_compute[181978]: 2026-01-12 13:51:45.055 181991 DEBUG nova.compute.manager [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-changed-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:51:45 compute-0 nova_compute[181978]: 2026-01-12 13:51:45.055 181991 DEBUG nova.compute.manager [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Refreshing instance network info cache due to event network-changed-ef32f42e-782a-49a7-b298-f9694b804468. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:51:45 compute-0 nova_compute[181978]: 2026-01-12 13:51:45.055 181991 DEBUG oslo_concurrency.lockutils [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:51:45 compute-0 nova_compute[181978]: 2026-01-12 13:51:45.055 181991 DEBUG oslo_concurrency.lockutils [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:51:45 compute-0 nova_compute[181978]: 2026-01-12 13:51:45.056 181991 DEBUG nova.network.neutron [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Refreshing network info cache for port ef32f42e-782a-49a7-b298-f9694b804468 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:51:46 compute-0 nova_compute[181978]: 2026-01-12 13:51:46.312 181991 DEBUG nova.network.neutron [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updated VIF entry in instance network info cache for port ef32f42e-782a-49a7-b298-f9694b804468. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:51:46 compute-0 nova_compute[181978]: 2026-01-12 13:51:46.312 181991 DEBUG nova.network.neutron [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updating instance_info_cache with network_info: [{"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:51:46 compute-0 nova_compute[181978]: 2026-01-12 13:51:46.330 181991 DEBUG oslo_concurrency.lockutils [req-4c2dd6f0-7b16-4d14-91f3-8bcd681216ae req-3aa9b7ef-04ff-46b3-ad02-4c57cc0e2dc7 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:51:47 compute-0 nova_compute[181978]: 2026-01-12 13:51:47.278 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:49 compute-0 nova_compute[181978]: 2026-01-12 13:51:49.445 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:50 compute-0 ovn_controller[94974]: 2026-01-12T13:51:50Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:aa:d8 10.100.0.8
Jan 12 13:51:50 compute-0 ovn_controller[94974]: 2026-01-12T13:51:50Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:aa:d8 10.100.0.8
Jan 12 13:51:51 compute-0 podman[214191]: 2026-01-12 13:51:51.579450803 +0000 UTC m=+0.056190544 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 12 13:51:51 compute-0 podman[214189]: 2026-01-12 13:51:51.59755062 +0000 UTC m=+0.079782271 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 12 13:51:51 compute-0 podman[214190]: 2026-01-12 13:51:51.599076311 +0000 UTC m=+0.077731233 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:51:52 compute-0 nova_compute[181978]: 2026-01-12 13:51:52.281 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:54 compute-0 nova_compute[181978]: 2026-01-12 13:51:54.446 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:57 compute-0 nova_compute[181978]: 2026-01-12 13:51:57.128 181991 INFO nova.compute.manager [None req-03c009f4-aa83-424a-acb7-509acb53b9ad d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Get console output
Jan 12 13:51:57 compute-0 nova_compute[181978]: 2026-01-12 13:51:57.132 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:51:57 compute-0 nova_compute[181978]: 2026-01-12 13:51:57.282 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:58 compute-0 ovn_controller[94974]: 2026-01-12T13:51:58Z|00172|binding|INFO|Releasing lport e6853e4f-6c6d-4b63-941a-c93a069019de from this chassis (sb_readonly=0)
Jan 12 13:51:58 compute-0 nova_compute[181978]: 2026-01-12 13:51:58.373 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:58 compute-0 ovn_controller[94974]: 2026-01-12T13:51:58Z|00173|binding|INFO|Releasing lport e6853e4f-6c6d-4b63-941a-c93a069019de from this chassis (sb_readonly=0)
Jan 12 13:51:58 compute-0 nova_compute[181978]: 2026-01-12 13:51:58.426 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:59 compute-0 nova_compute[181978]: 2026-01-12 13:51:59.448 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:51:59 compute-0 podman[214252]: 2026-01-12 13:51:59.542612898 +0000 UTC m=+0.036120589 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 12 13:51:59 compute-0 nova_compute[181978]: 2026-01-12 13:51:59.545 181991 INFO nova.compute.manager [None req-50df01fe-a399-4c3a-a83f-49e97f5057e2 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Get console output
Jan 12 13:51:59 compute-0 nova_compute[181978]: 2026-01-12 13:51:59.549 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:52:01 compute-0 nova_compute[181978]: 2026-01-12 13:52:01.325 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:01 compute-0 NetworkManager[55211]: <info>  [1768225921.3262] manager: (patch-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 12 13:52:01 compute-0 NetworkManager[55211]: <info>  [1768225921.3268] manager: (patch-br-int-to-provnet-94221086-ac91-4d8d-8a23-18ab95fd0518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Jan 12 13:52:01 compute-0 nova_compute[181978]: 2026-01-12 13:52:01.378 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:01 compute-0 ovn_controller[94974]: 2026-01-12T13:52:01Z|00174|binding|INFO|Releasing lport e6853e4f-6c6d-4b63-941a-c93a069019de from this chassis (sb_readonly=0)
Jan 12 13:52:01 compute-0 nova_compute[181978]: 2026-01-12 13:52:01.382 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:01 compute-0 nova_compute[181978]: 2026-01-12 13:52:01.608 181991 INFO nova.compute.manager [None req-f264a924-55d3-4bed-9df9-d4ea0495c884 d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Get console output
Jan 12 13:52:01 compute-0 nova_compute[181978]: 2026-01-12 13:52:01.612 209863 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.284 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.622 181991 DEBUG nova.compute.manager [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-changed-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.622 181991 DEBUG nova.compute.manager [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Refreshing instance network info cache due to event network-changed-ef32f42e-782a-49a7-b298-f9694b804468. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.623 181991 DEBUG oslo_concurrency.lockutils [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.623 181991 DEBUG oslo_concurrency.lockutils [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquired lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.624 181991 DEBUG nova.network.neutron [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Refreshing network info cache for port ef32f42e-782a-49a7-b298-f9694b804468 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.652 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.652 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.652 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.653 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.653 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.654 181991 INFO nova.compute.manager [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Terminating instance
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.655 181991 DEBUG nova.compute.manager [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 12 13:52:02 compute-0 kernel: tapef32f42e-78 (unregistering): left promiscuous mode
Jan 12 13:52:02 compute-0 NetworkManager[55211]: <info>  [1768225922.6735] device (tapef32f42e-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 12 13:52:02 compute-0 ovn_controller[94974]: 2026-01-12T13:52:02Z|00175|binding|INFO|Releasing lport ef32f42e-782a-49a7-b298-f9694b804468 from this chassis (sb_readonly=0)
Jan 12 13:52:02 compute-0 ovn_controller[94974]: 2026-01-12T13:52:02Z|00176|binding|INFO|Setting lport ef32f42e-782a-49a7-b298-f9694b804468 down in Southbound
Jan 12 13:52:02 compute-0 ovn_controller[94974]: 2026-01-12T13:52:02Z|00177|binding|INFO|Removing iface tapef32f42e-78 ovn-installed in OVS
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.682 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.694 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:aa:d8 10.100.0.8'], port_security=['fa:16:3e:a9:aa:d8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b449d9a3-fefa-4191-b13a-8cf17c4292c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c978298f864c4039b47e09202eaf780c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '384e5c02-aa7e-4fb8-bedb-232989afa169', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=818fb4cb-d61b-4330-a5a9-5522d2572cd7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>], logical_port=ef32f42e-782a-49a7-b298-f9694b804468) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe37841e610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.696 104189 INFO neutron.agent.ovn.metadata.agent [-] Port ef32f42e-782a-49a7-b298-f9694b804468 in datapath 32cf8d0a-4048-4450-b516-f3e1b18206b2 unbound from our chassis
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.699 104189 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 32cf8d0a-4048-4450-b516-f3e1b18206b2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.701 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8249e602-5e8d-490d-b750-07a69976fb78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.704 104189 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2 namespace which is not needed anymore
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.706 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 12 13:52:02 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 10.564s CPU time.
Jan 12 13:52:02 compute-0 systemd-machined[153581]: Machine qemu-13-instance-0000000d terminated.
Jan 12 13:52:02 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [NOTICE]   (214156) : haproxy version is 2.8.14-c23fe91
Jan 12 13:52:02 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [NOTICE]   (214156) : path to executable is /usr/sbin/haproxy
Jan 12 13:52:02 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [WARNING]  (214156) : Exiting Master process...
Jan 12 13:52:02 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [WARNING]  (214156) : Exiting Master process...
Jan 12 13:52:02 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [ALERT]    (214156) : Current worker (214158) exited with code 143 (Terminated)
Jan 12 13:52:02 compute-0 neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2[214152]: [WARNING]  (214156) : All workers exited. Exiting... (0)
Jan 12 13:52:02 compute-0 systemd[1]: libpod-6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8.scope: Deactivated successfully.
Jan 12 13:52:02 compute-0 podman[214290]: 2026-01-12 13:52:02.814511625 +0000 UTC m=+0.033935040 container died 6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 12 13:52:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8-userdata-shm.mount: Deactivated successfully.
Jan 12 13:52:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-fbeb97c9b3cfa26cfa5b87b853058cf3922a3be195c4de958b67a9a78d3a7c19-merged.mount: Deactivated successfully.
Jan 12 13:52:02 compute-0 podman[214290]: 2026-01-12 13:52:02.835151551 +0000 UTC m=+0.054574966 container cleanup 6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 12 13:52:02 compute-0 systemd[1]: libpod-conmon-6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8.scope: Deactivated successfully.
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.870 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.874 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 podman[214313]: 2026-01-12 13:52:02.884150129 +0000 UTC m=+0.033722842 container remove 6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.888 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[44644bc8-94b3-4597-97db-f3d11e4940a5]: (4, ('Mon Jan 12 01:52:02 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2 (6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8)\n6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8\nMon Jan 12 01:52:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2 (6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8)\n6b198c2141956db5c3708a80ba1e60e410dea236d198c86bbef89388e146f9e8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.889 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[8528c5a8-3acc-4709-8b27-e031bac4ebc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.890 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32cf8d0a-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:52:02 compute-0 kernel: tap32cf8d0a-40: left promiscuous mode
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.891 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.903 181991 INFO nova.virt.libvirt.driver [-] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Instance destroyed successfully.
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.904 181991 DEBUG nova.objects.instance [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lazy-loading 'resources' on Instance uuid b449d9a3-fefa-4191-b13a-8cf17c4292c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.905 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.907 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.907 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[3e405cd7-5859-4e9b-9c3b-cff1a6f39932]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.921 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[235b0bdf-c8fd-415a-9499-e62a830e2206]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.921 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed68da6-56a4-4406-9f5d-aaae61bd0d1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.930 181991 DEBUG nova.virt.libvirt.vif [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-12T13:51:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1179119248',display_name='tempest-TestNetworkBasicOps-server-1179119248',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1179119248',id=13,image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDuDakJSlC28NqdQfClvGdWH2uIHquw7GGQzNxmLDpJgSfvu9Kv1ha2IJC4FIKQq24Fm4Mcp8+8ichv49iWIDW3bWILECrmVam9ciCfRMc/hiW/2heaG67GoRrFUKSkGLA==',key_name='tempest-TestNetworkBasicOps-128564872',keypairs=<?>,launch_index=0,launched_at=2026-01-12T13:51:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c978298f864c4039b47e09202eaf780c',ramdisk_id='',reservation_id='r-922t7zit',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='bcf708d4-c9eb-4a4c-9503-f846d9f4a560',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-868482653',owner_user_name='tempest-TestNetworkBasicOps-868482653-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-12T13:51:41Z,user_data=None,user_id='d4158a3958504a578730a6b3561138ce',uuid=b449d9a3-fefa-4191-b13a-8cf17c4292c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.930 181991 DEBUG nova.network.os_vif_util [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converting VIF {"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.931 181991 DEBUG nova.network.os_vif_util [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:aa:d8,bridge_name='br-int',has_traffic_filtering=True,id=ef32f42e-782a-49a7-b298-f9694b804468,network=Network(32cf8d0a-4048-4450-b516-f3e1b18206b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef32f42e-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.931 181991 DEBUG os_vif [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:aa:d8,bridge_name='br-int',has_traffic_filtering=True,id=ef32f42e-782a-49a7-b298-f9694b804468,network=Network(32cf8d0a-4048-4450-b516-f3e1b18206b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef32f42e-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.932 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.932 181991 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef32f42e-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.933 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.933 209930 DEBUG oslo.privsep.daemon [-] privsep: reply[b13e17fa-012e-4cb5-ba22-b357a6c3a08a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 296677, 'reachable_time': 16883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214347, 'error': None, 'target': 'ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.935 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d32cf8d0a\x2d4048\x2d4450\x2db516\x2df3e1b18206b2.mount: Deactivated successfully.
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.937 181991 INFO os_vif [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:aa:d8,bridge_name='br-int',has_traffic_filtering=True,id=ef32f42e-782a-49a7-b298-f9694b804468,network=Network(32cf8d0a-4048-4450-b516-f3e1b18206b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef32f42e-78')
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.937 104723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-32cf8d0a-4048-4450-b516-f3e1b18206b2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 12 13:52:02 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:02.937 104723 DEBUG oslo.privsep.daemon [-] privsep: reply[2f818a6c-d87a-4fe0-9fa5-e6d77a5c5d66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.937 181991 INFO nova.virt.libvirt.driver [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Deleting instance files /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5_del
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.938 181991 INFO nova.virt.libvirt.driver [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Deletion of /var/lib/nova/instances/b449d9a3-fefa-4191-b13a-8cf17c4292c5_del complete
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.986 181991 INFO nova.compute.manager [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.987 181991 DEBUG oslo.service.loopingcall [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.987 181991 DEBUG nova.compute.manager [-] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 12 13:52:02 compute-0 nova_compute[181978]: 2026-01-12 13:52:02.987 181991 DEBUG nova.network.neutron [-] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 12 13:52:03 compute-0 nova_compute[181978]: 2026-01-12 13:52:03.150 181991 DEBUG nova.compute.manager [req-a6e677d4-0efe-4d54-a904-80a3b9f9c68d req-73d45012-c726-4017-839d-a8888afd120c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-vif-unplugged-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:52:03 compute-0 nova_compute[181978]: 2026-01-12 13:52:03.150 181991 DEBUG oslo_concurrency.lockutils [req-a6e677d4-0efe-4d54-a904-80a3b9f9c68d req-73d45012-c726-4017-839d-a8888afd120c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:03 compute-0 nova_compute[181978]: 2026-01-12 13:52:03.151 181991 DEBUG oslo_concurrency.lockutils [req-a6e677d4-0efe-4d54-a904-80a3b9f9c68d req-73d45012-c726-4017-839d-a8888afd120c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:03 compute-0 nova_compute[181978]: 2026-01-12 13:52:03.151 181991 DEBUG oslo_concurrency.lockutils [req-a6e677d4-0efe-4d54-a904-80a3b9f9c68d req-73d45012-c726-4017-839d-a8888afd120c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:03 compute-0 nova_compute[181978]: 2026-01-12 13:52:03.151 181991 DEBUG nova.compute.manager [req-a6e677d4-0efe-4d54-a904-80a3b9f9c68d req-73d45012-c726-4017-839d-a8888afd120c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] No waiting events found dispatching network-vif-unplugged-ef32f42e-782a-49a7-b298-f9694b804468 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:52:03 compute-0 nova_compute[181978]: 2026-01-12 13:52:03.152 181991 DEBUG nova.compute.manager [req-a6e677d4-0efe-4d54-a904-80a3b9f9c68d req-73d45012-c726-4017-839d-a8888afd120c 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-vif-unplugged-ef32f42e-782a-49a7-b298-f9694b804468 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.334 181991 DEBUG nova.network.neutron [-] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.354 181991 INFO nova.compute.manager [-] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Took 1.37 seconds to deallocate network for instance.
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.395 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.395 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.445 181991 DEBUG nova.compute.provider_tree [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.456 181991 DEBUG nova.scheduler.client.report [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.472 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.491 181991 INFO nova.scheduler.client.report [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Deleted allocations for instance b449d9a3-fefa-4191-b13a-8cf17c4292c5
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.538 181991 DEBUG oslo_concurrency.lockutils [None req-29f2db28-4718-4897-a3a8-a6d6cc34614f d4158a3958504a578730a6b3561138ce c978298f864c4039b47e09202eaf780c - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.886s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.569 181991 DEBUG nova.network.neutron [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updated VIF entry in instance network info cache for port ef32f42e-782a-49a7-b298-f9694b804468. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.569 181991 DEBUG nova.network.neutron [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Updating instance_info_cache with network_info: [{"id": "ef32f42e-782a-49a7-b298-f9694b804468", "address": "fa:16:3e:a9:aa:d8", "network": {"id": "32cf8d0a-4048-4450-b516-f3e1b18206b2", "bridge": "br-int", "label": "tempest-network-smoke--443522515", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c978298f864c4039b47e09202eaf780c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef32f42e-78", "ovs_interfaceid": "ef32f42e-782a-49a7-b298-f9694b804468", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 12 13:52:04 compute-0 nova_compute[181978]: 2026-01-12 13:52:04.581 181991 DEBUG oslo_concurrency.lockutils [req-3a3090e2-791a-44da-9995-2d57809853ff req-e2f80394-d03a-4e26-9189-b4b922c45812 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Releasing lock "refresh_cache-b449d9a3-fefa-4191-b13a-8cf17c4292c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 12 13:52:05 compute-0 nova_compute[181978]: 2026-01-12 13:52:05.212 181991 DEBUG nova.compute.manager [req-8b9c2c5b-1b99-432f-8a94-92e7866c611f req-4b5753f9-60e4-4e42-8ee5-06465db3af2e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:52:05 compute-0 nova_compute[181978]: 2026-01-12 13:52:05.212 181991 DEBUG oslo_concurrency.lockutils [req-8b9c2c5b-1b99-432f-8a94-92e7866c611f req-4b5753f9-60e4-4e42-8ee5-06465db3af2e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Acquiring lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:05 compute-0 nova_compute[181978]: 2026-01-12 13:52:05.212 181991 DEBUG oslo_concurrency.lockutils [req-8b9c2c5b-1b99-432f-8a94-92e7866c611f req-4b5753f9-60e4-4e42-8ee5-06465db3af2e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:05 compute-0 nova_compute[181978]: 2026-01-12 13:52:05.213 181991 DEBUG oslo_concurrency.lockutils [req-8b9c2c5b-1b99-432f-8a94-92e7866c611f req-4b5753f9-60e4-4e42-8ee5-06465db3af2e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] Lock "b449d9a3-fefa-4191-b13a-8cf17c4292c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:05 compute-0 nova_compute[181978]: 2026-01-12 13:52:05.213 181991 DEBUG nova.compute.manager [req-8b9c2c5b-1b99-432f-8a94-92e7866c611f req-4b5753f9-60e4-4e42-8ee5-06465db3af2e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] No waiting events found dispatching network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 12 13:52:05 compute-0 nova_compute[181978]: 2026-01-12 13:52:05.213 181991 WARNING nova.compute.manager [req-8b9c2c5b-1b99-432f-8a94-92e7866c611f req-4b5753f9-60e4-4e42-8ee5-06465db3af2e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received unexpected event network-vif-plugged-ef32f42e-782a-49a7-b298-f9694b804468 for instance with vm_state deleted and task_state None.
Jan 12 13:52:05 compute-0 nova_compute[181978]: 2026-01-12 13:52:05.213 181991 DEBUG nova.compute.manager [req-8b9c2c5b-1b99-432f-8a94-92e7866c611f req-4b5753f9-60e4-4e42-8ee5-06465db3af2e 0de125f96fc34e6683b74a957e0d4953 b11c1a60f70047ed8351105d38f912dd - - default default] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Received event network-vif-deleted-ef32f42e-782a-49a7-b298-f9694b804468 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 12 13:52:07 compute-0 nova_compute[181978]: 2026-01-12 13:52:07.286 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:07 compute-0 nova_compute[181978]: 2026-01-12 13:52:07.934 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:10 compute-0 nova_compute[181978]: 2026-01-12 13:52:10.418 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:10 compute-0 nova_compute[181978]: 2026-01-12 13:52:10.490 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:10 compute-0 podman[214349]: 2026-01-12 13:52:10.543170877 +0000 UTC m=+0.037169603 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:52:12 compute-0 nova_compute[181978]: 2026-01-12 13:52:12.288 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:12 compute-0 nova_compute[181978]: 2026-01-12 13:52:12.935 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:13 compute-0 podman[214370]: 2026-01-12 13:52:13.564751562 +0000 UTC m=+0.060220393 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 12 13:52:17 compute-0 nova_compute[181978]: 2026-01-12 13:52:17.289 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:17 compute-0 nova_compute[181978]: 2026-01-12 13:52:17.903 181991 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1768225922.9018245, b449d9a3-fefa-4191-b13a-8cf17c4292c5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 12 13:52:17 compute-0 nova_compute[181978]: 2026-01-12 13:52:17.903 181991 INFO nova.compute.manager [-] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] VM Stopped (Lifecycle Event)
Jan 12 13:52:17 compute-0 nova_compute[181978]: 2026-01-12 13:52:17.923 181991 DEBUG nova.compute.manager [None req-6b6837c0-5dbf-433c-8591-134bfe187928 - - - - - -] [instance: b449d9a3-fefa-4191-b13a-8cf17c4292c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 12 13:52:17 compute-0 nova_compute[181978]: 2026-01-12 13:52:17.936 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:19 compute-0 nova_compute[181978]: 2026-01-12 13:52:19.481 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:19 compute-0 nova_compute[181978]: 2026-01-12 13:52:19.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:52:19 compute-0 nova_compute[181978]: 2026-01-12 13:52:19.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:52:19 compute-0 nova_compute[181978]: 2026-01-12 13:52:19.498 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:52:22 compute-0 nova_compute[181978]: 2026-01-12 13:52:22.291 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:22 compute-0 nova_compute[181978]: 2026-01-12 13:52:22.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:22 compute-0 podman[214388]: 2026-01-12 13:52:22.550365872 +0000 UTC m=+0.040798718 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Jan 12 13:52:22 compute-0 podman[214387]: 2026-01-12 13:52:22.576501914 +0000 UTC m=+0.068289828 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 12 13:52:22 compute-0 podman[214386]: 2026-01-12 13:52:22.578816096 +0000 UTC m=+0.073131012 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 12 13:52:22 compute-0 nova_compute[181978]: 2026-01-12 13:52:22.937 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:23 compute-0 nova_compute[181978]: 2026-01-12 13:52:23.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.502 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.693 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.693 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5776MB free_disk=73.3803596496582GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.694 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.694 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.749 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.750 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.766 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.778 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.795 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:52:24 compute-0 nova_compute[181978]: 2026-01-12 13:52:24.796 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:25 compute-0 nova_compute[181978]: 2026-01-12 13:52:25.796 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:26 compute-0 nova_compute[181978]: 2026-01-12 13:52:26.476 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:26 compute-0 nova_compute[181978]: 2026-01-12 13:52:26.477 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:26 compute-0 nova_compute[181978]: 2026-01-12 13:52:26.490 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:52:26 compute-0 nova_compute[181978]: 2026-01-12 13:52:26.491 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:52:27 compute-0 nova_compute[181978]: 2026-01-12 13:52:27.293 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:27 compute-0 nova_compute[181978]: 2026-01-12 13:52:27.939 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:30 compute-0 podman[214449]: 2026-01-12 13:52:30.548547824 +0000 UTC m=+0.036795349 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:52:32 compute-0 nova_compute[181978]: 2026-01-12 13:52:32.296 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:32 compute-0 nova_compute[181978]: 2026-01-12 13:52:32.941 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:37 compute-0 nova_compute[181978]: 2026-01-12 13:52:37.298 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:37 compute-0 nova_compute[181978]: 2026-01-12 13:52:37.941 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:40.206 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:52:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:40.206 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:52:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:52:40.206 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:52:40 compute-0 ovn_controller[94974]: 2026-01-12T13:52:40Z|00178|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Jan 12 13:52:41 compute-0 podman[214465]: 2026-01-12 13:52:41.54039353 +0000 UTC m=+0.035935370 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 12 13:52:42 compute-0 nova_compute[181978]: 2026-01-12 13:52:42.301 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:42 compute-0 nova_compute[181978]: 2026-01-12 13:52:42.942 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:44 compute-0 podman[214486]: 2026-01-12 13:52:44.54335129 +0000 UTC m=+0.038899036 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 12 13:52:47 compute-0 nova_compute[181978]: 2026-01-12 13:52:47.304 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:47 compute-0 nova_compute[181978]: 2026-01-12 13:52:47.944 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:52 compute-0 nova_compute[181978]: 2026-01-12 13:52:52.306 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:52 compute-0 nova_compute[181978]: 2026-01-12 13:52:52.945 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:53 compute-0 podman[214506]: 2026-01-12 13:52:53.572499722 +0000 UTC m=+0.063868903 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:52:53 compute-0 podman[214505]: 2026-01-12 13:52:53.574483012 +0000 UTC m=+0.067718553 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 12 13:52:53 compute-0 podman[214507]: 2026-01-12 13:52:53.580509787 +0000 UTC m=+0.070060769 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:52:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:52:57 compute-0 nova_compute[181978]: 2026-01-12 13:52:57.308 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:52:57 compute-0 nova_compute[181978]: 2026-01-12 13:52:57.947 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:01 compute-0 podman[214568]: 2026-01-12 13:53:01.541456244 +0000 UTC m=+0.037159784 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 12 13:53:02 compute-0 nova_compute[181978]: 2026-01-12 13:53:02.311 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:02 compute-0 nova_compute[181978]: 2026-01-12 13:53:02.948 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:07 compute-0 nova_compute[181978]: 2026-01-12 13:53:07.313 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:07 compute-0 nova_compute[181978]: 2026-01-12 13:53:07.950 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:12 compute-0 nova_compute[181978]: 2026-01-12 13:53:12.314 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:12 compute-0 podman[214584]: 2026-01-12 13:53:12.539488134 +0000 UTC m=+0.033850122 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:53:12 compute-0 nova_compute[181978]: 2026-01-12 13:53:12.950 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:15 compute-0 podman[214605]: 2026-01-12 13:53:15.538418139 +0000 UTC m=+0.033940931 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 12 13:53:17 compute-0 nova_compute[181978]: 2026-01-12 13:53:17.317 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:17 compute-0 nova_compute[181978]: 2026-01-12 13:53:17.951 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:20 compute-0 nova_compute[181978]: 2026-01-12 13:53:20.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:20 compute-0 nova_compute[181978]: 2026-01-12 13:53:20.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:53:20 compute-0 nova_compute[181978]: 2026-01-12 13:53:20.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:53:20 compute-0 nova_compute[181978]: 2026-01-12 13:53:20.495 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:53:22 compute-0 nova_compute[181978]: 2026-01-12 13:53:22.318 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:22 compute-0 nova_compute[181978]: 2026-01-12 13:53:22.953 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:23 compute-0 nova_compute[181978]: 2026-01-12 13:53:23.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.500 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.500 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.500 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.500 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:53:24 compute-0 podman[214623]: 2026-01-12 13:53:24.568797937 +0000 UTC m=+0.056632003 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:53:24 compute-0 podman[214624]: 2026-01-12 13:53:24.591189367 +0000 UTC m=+0.077878790 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Jan 12 13:53:24 compute-0 podman[214622]: 2026-01-12 13:53:24.595518288 +0000 UTC m=+0.086442637 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.717 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.718 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5775MB free_disk=73.3803596496582GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.718 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.718 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.769 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.770 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.787 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.799 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.800 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:53:24 compute-0 nova_compute[181978]: 2026-01-12 13:53:24.801 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:53:25 compute-0 nova_compute[181978]: 2026-01-12 13:53:25.801 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:25 compute-0 nova_compute[181978]: 2026-01-12 13:53:25.801 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:26 compute-0 nova_compute[181978]: 2026-01-12 13:53:26.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:26 compute-0 nova_compute[181978]: 2026-01-12 13:53:26.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:27 compute-0 nova_compute[181978]: 2026-01-12 13:53:27.318 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:27 compute-0 nova_compute[181978]: 2026-01-12 13:53:27.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:27 compute-0 nova_compute[181978]: 2026-01-12 13:53:27.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:53:27 compute-0 nova_compute[181978]: 2026-01-12 13:53:27.954 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:28 compute-0 nova_compute[181978]: 2026-01-12 13:53:28.476 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:53:32 compute-0 nova_compute[181978]: 2026-01-12 13:53:32.320 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:32 compute-0 podman[214686]: 2026-01-12 13:53:32.54252486 +0000 UTC m=+0.037564094 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 12 13:53:32 compute-0 nova_compute[181978]: 2026-01-12 13:53:32.954 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:37 compute-0 nova_compute[181978]: 2026-01-12 13:53:37.322 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:37 compute-0 nova_compute[181978]: 2026-01-12 13:53:37.956 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:53:40.207 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:53:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:53:40.207 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:53:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:53:40.207 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:53:42 compute-0 nova_compute[181978]: 2026-01-12 13:53:42.324 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:42 compute-0 sshd-session[214702]: Accepted publickey for zuul from 192.168.122.10 port 58722 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:53:42 compute-0 systemd-logind[775]: New session 26 of user zuul.
Jan 12 13:53:42 compute-0 systemd[1]: Started Session 26 of User zuul.
Jan 12 13:53:42 compute-0 sshd-session[214702]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:53:42 compute-0 podman[214704]: 2026-01-12 13:53:42.635291387 +0000 UTC m=+0.038822261 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:53:42 compute-0 sudo[214727]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 12 13:53:42 compute-0 sudo[214727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:53:42 compute-0 nova_compute[181978]: 2026-01-12 13:53:42.957 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:46 compute-0 ovs-vsctl[214887]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 12 13:53:46 compute-0 podman[214923]: 2026-01-12 13:53:46.549537708 +0000 UTC m=+0.045887368 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:53:46 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 214751 (sos)
Jan 12 13:53:46 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 12 13:53:46 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 12 13:53:46 compute-0 virtqemud[153584]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 12 13:53:46 compute-0 virtqemud[153584]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 12 13:53:46 compute-0 virtqemud[153584]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 12 13:53:47 compute-0 nova_compute[181978]: 2026-01-12 13:53:47.324 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:47 compute-0 crontab[215287]: (root) LIST (root)
Jan 12 13:53:47 compute-0 nova_compute[181978]: 2026-01-12 13:53:47.957 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:49 compute-0 systemd[1]: Starting Hostname Service...
Jan 12 13:53:49 compute-0 systemd[1]: Started Hostname Service.
Jan 12 13:53:52 compute-0 nova_compute[181978]: 2026-01-12 13:53:52.325 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:52 compute-0 nova_compute[181978]: 2026-01-12 13:53:52.958 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:55 compute-0 podman[216689]: 2026-01-12 13:53:55.316520637 +0000 UTC m=+0.065036199 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6)
Jan 12 13:53:55 compute-0 podman[216687]: 2026-01-12 13:53:55.342395126 +0000 UTC m=+0.089542277 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:53:55 compute-0 podman[216686]: 2026-01-12 13:53:55.364806414 +0000 UTC m=+0.117517918 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 12 13:53:55 compute-0 ovs-appctl[216812]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 12 13:53:55 compute-0 ovs-appctl[216823]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 12 13:53:55 compute-0 ovs-appctl[216828]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 12 13:53:57 compute-0 nova_compute[181978]: 2026-01-12 13:53:57.326 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:53:57 compute-0 nova_compute[181978]: 2026-01-12 13:53:57.959 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:00 compute-0 virtqemud[153584]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 12 13:54:02 compute-0 systemd[1]: Starting Time & Date Service...
Jan 12 13:54:02 compute-0 nova_compute[181978]: 2026-01-12 13:54:02.326 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:02 compute-0 systemd[1]: Started Time & Date Service.
Jan 12 13:54:02 compute-0 podman[218224]: 2026-01-12 13:54:02.621948921 +0000 UTC m=+0.054906548 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:54:02 compute-0 nova_compute[181978]: 2026-01-12 13:54:02.961 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:07 compute-0 nova_compute[181978]: 2026-01-12 13:54:07.327 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:07 compute-0 nova_compute[181978]: 2026-01-12 13:54:07.961 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:12 compute-0 nova_compute[181978]: 2026-01-12 13:54:12.330 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:12 compute-0 nova_compute[181978]: 2026-01-12 13:54:12.962 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:13 compute-0 podman[218274]: 2026-01-12 13:54:13.546757961 +0000 UTC m=+0.038678710 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:54:17 compute-0 nova_compute[181978]: 2026-01-12 13:54:17.332 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:17 compute-0 podman[218296]: 2026-01-12 13:54:17.546345757 +0000 UTC m=+0.041426666 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:54:17 compute-0 nova_compute[181978]: 2026-01-12 13:54:17.963 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:18 compute-0 sudo[214727]: pam_unix(sudo:session): session closed for user root
Jan 12 13:54:18 compute-0 sshd-session[214713]: Received disconnect from 192.168.122.10 port 58722:11: disconnected by user
Jan 12 13:54:18 compute-0 sshd-session[214713]: Disconnected from user zuul 192.168.122.10 port 58722
Jan 12 13:54:18 compute-0 sshd-session[214702]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:54:18 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 12 13:54:18 compute-0 systemd[1]: session-26.scope: Consumed 55.643s CPU time, 476.8M memory peak, read 103.0M from disk, written 33.4M to disk.
Jan 12 13:54:18 compute-0 systemd-logind[775]: Session 26 logged out. Waiting for processes to exit.
Jan 12 13:54:18 compute-0 systemd-logind[775]: Removed session 26.
Jan 12 13:54:18 compute-0 sshd-session[218314]: Accepted publickey for zuul from 192.168.122.10 port 35906 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:54:18 compute-0 systemd-logind[775]: New session 27 of user zuul.
Jan 12 13:54:18 compute-0 systemd[1]: Started Session 27 of User zuul.
Jan 12 13:54:18 compute-0 sshd-session[218314]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:54:18 compute-0 sudo[218318]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-01-12-sblumsf.tar.xz
Jan 12 13:54:18 compute-0 sudo[218318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:54:18 compute-0 sudo[218318]: pam_unix(sudo:session): session closed for user root
Jan 12 13:54:18 compute-0 sshd-session[218317]: Received disconnect from 192.168.122.10 port 35906:11: disconnected by user
Jan 12 13:54:18 compute-0 sshd-session[218317]: Disconnected from user zuul 192.168.122.10 port 35906
Jan 12 13:54:18 compute-0 sshd-session[218314]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:54:18 compute-0 systemd-logind[775]: Session 27 logged out. Waiting for processes to exit.
Jan 12 13:54:18 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Jan 12 13:54:18 compute-0 systemd-logind[775]: Removed session 27.
Jan 12 13:54:18 compute-0 sshd-session[218343]: Accepted publickey for zuul from 192.168.122.10 port 35922 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:54:18 compute-0 systemd-logind[775]: New session 28 of user zuul.
Jan 12 13:54:19 compute-0 systemd[1]: Started Session 28 of User zuul.
Jan 12 13:54:19 compute-0 sshd-session[218343]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:54:19 compute-0 sudo[218347]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 12 13:54:19 compute-0 sudo[218347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:54:19 compute-0 sudo[218347]: pam_unix(sudo:session): session closed for user root
Jan 12 13:54:19 compute-0 sshd-session[218346]: Received disconnect from 192.168.122.10 port 35922:11: disconnected by user
Jan 12 13:54:19 compute-0 sshd-session[218346]: Disconnected from user zuul 192.168.122.10 port 35922
Jan 12 13:54:19 compute-0 sshd-session[218343]: pam_unix(sshd:session): session closed for user zuul
Jan 12 13:54:19 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Jan 12 13:54:19 compute-0 systemd-logind[775]: Session 28 logged out. Waiting for processes to exit.
Jan 12 13:54:19 compute-0 systemd-logind[775]: Removed session 28.
Jan 12 13:54:22 compute-0 nova_compute[181978]: 2026-01-12 13:54:22.332 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:22 compute-0 nova_compute[181978]: 2026-01-12 13:54:22.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:22 compute-0 nova_compute[181978]: 2026-01-12 13:54:22.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:54:22 compute-0 nova_compute[181978]: 2026-01-12 13:54:22.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:54:22 compute-0 nova_compute[181978]: 2026-01-12 13:54:22.495 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:54:22 compute-0 nova_compute[181978]: 2026-01-12 13:54:22.964 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.481 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.501 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.501 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.682 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.682 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5503MB free_disk=73.3797607421875GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.683 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.683 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.726 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.726 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.738 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Refreshing inventories for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.750 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating ProviderTree inventory for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.751 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Updating inventory in ProviderTree for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.760 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Refreshing aggregate associations for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.774 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Refreshing trait associations for resource provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AVX512VPCLMULQDQ,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_AVX512VAES,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.787 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.795 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.796 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:54:24 compute-0 nova_compute[181978]: 2026-01-12 13:54:24.796 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:54:25 compute-0 podman[218373]: 2026-01-12 13:54:25.553662609 +0000 UTC m=+0.046354090 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:54:25 compute-0 podman[218374]: 2026-01-12 13:54:25.565196292 +0000 UTC m=+0.055098780 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, 
build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, container_name=openstack_network_exporter)
Jan 12 13:54:25 compute-0 podman[218372]: 2026-01-12 13:54:25.571790349 +0000 UTC m=+0.064420766 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 12 13:54:26 compute-0 nova_compute[181978]: 2026-01-12 13:54:26.795 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:26 compute-0 nova_compute[181978]: 2026-01-12 13:54:26.796 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:27 compute-0 nova_compute[181978]: 2026-01-12 13:54:27.333 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:27 compute-0 nova_compute[181978]: 2026-01-12 13:54:27.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:27 compute-0 nova_compute[181978]: 2026-01-12 13:54:27.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:27 compute-0 nova_compute[181978]: 2026-01-12 13:54:27.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:27 compute-0 nova_compute[181978]: 2026-01-12 13:54:27.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:54:27 compute-0 nova_compute[181978]: 2026-01-12 13:54:27.965 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:28 compute-0 nova_compute[181978]: 2026-01-12 13:54:28.476 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:31 compute-0 nova_compute[181978]: 2026-01-12 13:54:31.475 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:54:32 compute-0 nova_compute[181978]: 2026-01-12 13:54:32.335 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:32 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 12 13:54:32 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 12 13:54:32 compute-0 nova_compute[181978]: 2026-01-12 13:54:32.967 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:33 compute-0 podman[218440]: 2026-01-12 13:54:33.539343141 +0000 UTC m=+0.034491348 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 12 13:54:37 compute-0 nova_compute[181978]: 2026-01-12 13:54:37.337 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:37 compute-0 nova_compute[181978]: 2026-01-12 13:54:37.967 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:54:40.208 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:54:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:54:40.209 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:54:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:54:40.209 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:54:42 compute-0 nova_compute[181978]: 2026-01-12 13:54:42.338 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:42 compute-0 nova_compute[181978]: 2026-01-12 13:54:42.968 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:44 compute-0 podman[218456]: 2026-01-12 13:54:44.539623504 +0000 UTC m=+0.035189912 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 12 13:54:47 compute-0 nova_compute[181978]: 2026-01-12 13:54:47.340 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:47 compute-0 podman[218478]: 2026-01-12 13:54:47.819503329 +0000 UTC m=+0.066439422 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:54:47 compute-0 nova_compute[181978]: 2026-01-12 13:54:47.970 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:52 compute-0 nova_compute[181978]: 2026-01-12 13:54:52.340 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:52 compute-0 nova_compute[181978]: 2026-01-12 13:54:52.971 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:54:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:54:56 compute-0 podman[218497]: 2026-01-12 13:54:56.561017982 +0000 UTC m=+0.049052171 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter)
Jan 12 13:54:56 compute-0 podman[218496]: 2026-01-12 13:54:56.576408838 +0000 UTC m=+0.068412141 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:54:56 compute-0 podman[218495]: 2026-01-12 13:54:56.60380837 +0000 UTC m=+0.097317876 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 12 13:54:57 compute-0 nova_compute[181978]: 2026-01-12 13:54:57.342 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:54:57 compute-0 nova_compute[181978]: 2026-01-12 13:54:57.972 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:02 compute-0 nova_compute[181978]: 2026-01-12 13:55:02.343 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:02 compute-0 nova_compute[181978]: 2026-01-12 13:55:02.973 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:04 compute-0 podman[218557]: 2026-01-12 13:55:04.542445086 +0000 UTC m=+0.036691193 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 12 13:55:07 compute-0 nova_compute[181978]: 2026-01-12 13:55:07.345 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:07 compute-0 nova_compute[181978]: 2026-01-12 13:55:07.975 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:12 compute-0 nova_compute[181978]: 2026-01-12 13:55:12.348 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:12 compute-0 nova_compute[181978]: 2026-01-12 13:55:12.975 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:15 compute-0 podman[218573]: 2026-01-12 13:55:15.54137448 +0000 UTC m=+0.035688929 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:55:17 compute-0 nova_compute[181978]: 2026-01-12 13:55:17.351 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:17 compute-0 nova_compute[181978]: 2026-01-12 13:55:17.976 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:18 compute-0 nova_compute[181978]: 2026-01-12 13:55:18.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:18 compute-0 podman[218594]: 2026-01-12 13:55:18.541418043 +0000 UTC m=+0.037915455 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:55:22 compute-0 nova_compute[181978]: 2026-01-12 13:55:22.353 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:22 compute-0 nova_compute[181978]: 2026-01-12 13:55:22.977 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:23 compute-0 nova_compute[181978]: 2026-01-12 13:55:23.494 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:23 compute-0 nova_compute[181978]: 2026-01-12 13:55:23.495 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 12 13:55:23 compute-0 nova_compute[181978]: 2026-01-12 13:55:23.515 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 12 13:55:24 compute-0 nova_compute[181978]: 2026-01-12 13:55:24.501 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:24 compute-0 nova_compute[181978]: 2026-01-12 13:55:24.501 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:55:24 compute-0 nova_compute[181978]: 2026-01-12 13:55:24.501 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:55:24 compute-0 nova_compute[181978]: 2026-01-12 13:55:24.516 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:55:25 compute-0 nova_compute[181978]: 2026-01-12 13:55:25.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.516 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.517 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.517 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.517 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.698 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.699 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5627MB free_disk=73.38005447387695GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.699 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:55:26 compute-0 nova_compute[181978]: 2026-01-12 13:55:26.699 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.002 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.002 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.021 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.042 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.044 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.044 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.044 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.044 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.355 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:27 compute-0 podman[218612]: 2026-01-12 13:55:27.551746094 +0000 UTC m=+0.044992039 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:55:27 compute-0 podman[218613]: 2026-01-12 13:55:27.560794446 +0000 UTC m=+0.047890698 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64)
Jan 12 13:55:27 compute-0 podman[218611]: 2026-01-12 13:55:27.572441443 +0000 UTC m=+0.066925856 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 12 13:55:27 compute-0 nova_compute[181978]: 2026-01-12 13:55:27.978 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:28 compute-0 nova_compute[181978]: 2026-01-12 13:55:28.055 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:28 compute-0 nova_compute[181978]: 2026-01-12 13:55:28.055 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:28 compute-0 nova_compute[181978]: 2026-01-12 13:55:28.055 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:28 compute-0 nova_compute[181978]: 2026-01-12 13:55:28.055 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:28 compute-0 nova_compute[181978]: 2026-01-12 13:55:28.055 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:55:29 compute-0 nova_compute[181978]: 2026-01-12 13:55:29.476 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:55:32 compute-0 nova_compute[181978]: 2026-01-12 13:55:32.356 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:32 compute-0 nova_compute[181978]: 2026-01-12 13:55:32.979 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:35 compute-0 podman[218673]: 2026-01-12 13:55:35.535351325 +0000 UTC m=+0.030160075 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:55:37 compute-0 nova_compute[181978]: 2026-01-12 13:55:37.358 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:37 compute-0 nova_compute[181978]: 2026-01-12 13:55:37.980 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:55:40.209 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:55:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:55:40.210 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:55:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:55:40.210 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:55:42 compute-0 nova_compute[181978]: 2026-01-12 13:55:42.360 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:42 compute-0 nova_compute[181978]: 2026-01-12 13:55:42.981 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:46 compute-0 podman[218689]: 2026-01-12 13:55:46.541408969 +0000 UTC m=+0.037045368 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:55:47 compute-0 nova_compute[181978]: 2026-01-12 13:55:47.360 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:47 compute-0 nova_compute[181978]: 2026-01-12 13:55:47.982 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:49 compute-0 podman[218710]: 2026-01-12 13:55:49.540543584 +0000 UTC m=+0.036373536 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 12 13:55:52 compute-0 nova_compute[181978]: 2026-01-12 13:55:52.362 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:52 compute-0 nova_compute[181978]: 2026-01-12 13:55:52.984 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:57 compute-0 nova_compute[181978]: 2026-01-12 13:55:57.364 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:57 compute-0 nova_compute[181978]: 2026-01-12 13:55:57.984 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:55:58 compute-0 podman[218730]: 2026-01-12 13:55:58.552349644 +0000 UTC m=+0.040730297 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Jan 12 13:55:58 compute-0 podman[218729]: 2026-01-12 13:55:58.563404658 +0000 UTC m=+0.053433059 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:55:58 compute-0 podman[218728]: 2026-01-12 13:55:58.570410309 +0000 UTC m=+0.063035512 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 12 13:56:02 compute-0 nova_compute[181978]: 2026-01-12 13:56:02.365 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:02 compute-0 nova_compute[181978]: 2026-01-12 13:56:02.985 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:06 compute-0 podman[218789]: 2026-01-12 13:56:06.543486565 +0000 UTC m=+0.038995636 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 12 13:56:07 compute-0 nova_compute[181978]: 2026-01-12 13:56:07.367 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:07 compute-0 nova_compute[181978]: 2026-01-12 13:56:07.986 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:11 compute-0 nova_compute[181978]: 2026-01-12 13:56:11.608 181991 DEBUG oslo_concurrency.processutils [None req-f4f0a209-b505-45da-a937-8c2d6db405d6 3d4b1903210c4c1899fab6cc43d475d1 07608c920d5845209073c2b943d2a58b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 12 13:56:11 compute-0 nova_compute[181978]: 2026-01-12 13:56:11.621 181991 DEBUG oslo_concurrency.processutils [None req-f4f0a209-b505-45da-a937-8c2d6db405d6 3d4b1903210c4c1899fab6cc43d475d1 07608c920d5845209073c2b943d2a58b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 12 13:56:12 compute-0 nova_compute[181978]: 2026-01-12 13:56:12.368 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:12 compute-0 nova_compute[181978]: 2026-01-12 13:56:12.987 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:16 compute-0 nova_compute[181978]: 2026-01-12 13:56:16.652 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:56:16.653 104189 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:a1:a4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '0e:9f:24:bd:cd:65'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 12 13:56:16 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:56:16.653 104189 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 12 13:56:17 compute-0 nova_compute[181978]: 2026-01-12 13:56:17.369 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:17 compute-0 podman[218807]: 2026-01-12 13:56:17.540415864 +0000 UTC m=+0.036312150 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:56:17 compute-0 nova_compute[181978]: 2026-01-12 13:56:17.989 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:20 compute-0 podman[218828]: 2026-01-12 13:56:20.544520264 +0000 UTC m=+0.039450160 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 12 13:56:22 compute-0 nova_compute[181978]: 2026-01-12 13:56:22.371 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:22 compute-0 nova_compute[181978]: 2026-01-12 13:56:22.990 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:23 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:56:23.655 104189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c2d4250-79a9-4504-9090-d7395fcb2080, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 12 13:56:26 compute-0 nova_compute[181978]: 2026-01-12 13:56:26.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:26 compute-0 nova_compute[181978]: 2026-01-12 13:56:26.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:56:26 compute-0 nova_compute[181978]: 2026-01-12 13:56:26.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:56:26 compute-0 nova_compute[181978]: 2026-01-12 13:56:26.494 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:56:26 compute-0 nova_compute[181978]: 2026-01-12 13:56:26.495 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.372 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.506 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.506 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.506 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.506 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.686 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.687 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5670MB free_disk=73.38008499145508GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.687 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.687 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.786 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.786 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.802 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.811 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.812 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.812 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:56:27 compute-0 nova_compute[181978]: 2026-01-12 13:56:27.991 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:28 compute-0 nova_compute[181978]: 2026-01-12 13:56:28.811 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:28 compute-0 nova_compute[181978]: 2026-01-12 13:56:28.812 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:28 compute-0 nova_compute[181978]: 2026-01-12 13:56:28.812 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:56:29 compute-0 nova_compute[181978]: 2026-01-12 13:56:29.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:29 compute-0 podman[218847]: 2026-01-12 13:56:29.557289823 +0000 UTC m=+0.047133824 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git)
Jan 12 13:56:29 compute-0 podman[218845]: 2026-01-12 13:56:29.578161994 +0000 UTC m=+0.072734044 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 12 13:56:29 compute-0 podman[218846]: 2026-01-12 13:56:29.578803831 +0000 UTC m=+0.070449159 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:56:31 compute-0 nova_compute[181978]: 2026-01-12 13:56:31.475 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:31 compute-0 nova_compute[181978]: 2026-01-12 13:56:31.475 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:56:32 compute-0 nova_compute[181978]: 2026-01-12 13:56:32.374 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:32 compute-0 nova_compute[181978]: 2026-01-12 13:56:32.992 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:37 compute-0 nova_compute[181978]: 2026-01-12 13:56:37.376 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:37 compute-0 podman[218907]: 2026-01-12 13:56:37.541325677 +0000 UTC m=+0.036202246 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 12 13:56:37 compute-0 nova_compute[181978]: 2026-01-12 13:56:37.993 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:56:40.210 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:56:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:56:40.210 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:56:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:56:40.210 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:56:42 compute-0 nova_compute[181978]: 2026-01-12 13:56:42.379 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:42 compute-0 nova_compute[181978]: 2026-01-12 13:56:42.993 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:47 compute-0 nova_compute[181978]: 2026-01-12 13:56:47.380 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:47 compute-0 nova_compute[181978]: 2026-01-12 13:56:47.994 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:48 compute-0 podman[218923]: 2026-01-12 13:56:48.563381083 +0000 UTC m=+0.059293646 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 12 13:56:51 compute-0 podman[218944]: 2026-01-12 13:56:51.545368902 +0000 UTC m=+0.040349470 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 12 13:56:52 compute-0 nova_compute[181978]: 2026-01-12 13:56:52.382 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:52 compute-0 nova_compute[181978]: 2026-01-12 13:56:52.996 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.123 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:55 compute-0 ceilometer_agent_compute[191615]: 2026-01-12 13:56:55.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 12 13:56:57 compute-0 nova_compute[181978]: 2026-01-12 13:56:57.383 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:56:57 compute-0 nova_compute[181978]: 2026-01-12 13:56:57.996 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:00 compute-0 podman[218962]: 2026-01-12 13:57:00.547770379 +0000 UTC m=+0.041006466 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 12 13:57:00 compute-0 podman[218963]: 2026-01-12 13:57:00.55347848 +0000 UTC m=+0.043343359 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 12 13:57:00 compute-0 podman[218961]: 2026-01-12 13:57:00.56608957 +0000 UTC m=+0.061292486 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 12 13:57:02 compute-0 nova_compute[181978]: 2026-01-12 13:57:02.384 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:02 compute-0 nova_compute[181978]: 2026-01-12 13:57:02.998 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:07 compute-0 nova_compute[181978]: 2026-01-12 13:57:07.387 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:07 compute-0 nova_compute[181978]: 2026-01-12 13:57:07.999 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:08 compute-0 podman[219024]: 2026-01-12 13:57:08.569551164 +0000 UTC m=+0.062587029 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 12 13:57:12 compute-0 nova_compute[181978]: 2026-01-12 13:57:12.389 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:13 compute-0 nova_compute[181978]: 2026-01-12 13:57:13.000 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:17 compute-0 nova_compute[181978]: 2026-01-12 13:57:17.391 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:18 compute-0 nova_compute[181978]: 2026-01-12 13:57:18.001 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:19 compute-0 podman[219041]: 2026-01-12 13:57:19.54151268 +0000 UTC m=+0.037727482 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:57:22 compute-0 nova_compute[181978]: 2026-01-12 13:57:22.393 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:22 compute-0 podman[219062]: 2026-01-12 13:57:22.545595199 +0000 UTC m=+0.040792603 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 12 13:57:23 compute-0 nova_compute[181978]: 2026-01-12 13:57:23.002 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:26 compute-0 nova_compute[181978]: 2026-01-12 13:57:26.479 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:27 compute-0 nova_compute[181978]: 2026-01-12 13:57:27.395 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.002 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.480 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.492 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.493 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.493 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.493 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.493 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.509 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.510 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.510 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.510 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.716 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.717 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.38008499145508GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.717 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.717 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.819 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.819 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.834 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.846 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.847 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:57:28 compute-0 nova_compute[181978]: 2026-01-12 13:57:28.847 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:57:29 compute-0 nova_compute[181978]: 2026-01-12 13:57:29.834 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:29 compute-0 nova_compute[181978]: 2026-01-12 13:57:29.834 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:29 compute-0 nova_compute[181978]: 2026-01-12 13:57:29.834 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:31 compute-0 nova_compute[181978]: 2026-01-12 13:57:31.476 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:57:31 compute-0 podman[219079]: 2026-01-12 13:57:31.568442493 +0000 UTC m=+0.063306541 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 12 13:57:31 compute-0 podman[219080]: 2026-01-12 13:57:31.57444093 +0000 UTC m=+0.067323763 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 12 13:57:31 compute-0 podman[219081]: 2026-01-12 13:57:31.586447564 +0000 UTC m=+0.076163021 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 12 13:57:32 compute-0 nova_compute[181978]: 2026-01-12 13:57:32.396 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:33 compute-0 nova_compute[181978]: 2026-01-12 13:57:33.004 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:37 compute-0 nova_compute[181978]: 2026-01-12 13:57:37.399 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:38 compute-0 nova_compute[181978]: 2026-01-12 13:57:38.005 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:39 compute-0 podman[219142]: 2026-01-12 13:57:39.536338724 +0000 UTC m=+0.034109789 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 12 13:57:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:57:40.210 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:57:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:57:40.211 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:57:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:57:40.211 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:57:42 compute-0 nova_compute[181978]: 2026-01-12 13:57:42.400 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:43 compute-0 nova_compute[181978]: 2026-01-12 13:57:43.005 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:47 compute-0 nova_compute[181978]: 2026-01-12 13:57:47.402 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:48 compute-0 nova_compute[181978]: 2026-01-12 13:57:48.006 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:50 compute-0 podman[219158]: 2026-01-12 13:57:50.552441378 +0000 UTC m=+0.034725457 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 12 13:57:52 compute-0 nova_compute[181978]: 2026-01-12 13:57:52.404 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:53 compute-0 nova_compute[181978]: 2026-01-12 13:57:53.007 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:53 compute-0 podman[219180]: 2026-01-12 13:57:53.543357153 +0000 UTC m=+0.039575025 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 12 13:57:57 compute-0 nova_compute[181978]: 2026-01-12 13:57:57.404 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:57:58 compute-0 nova_compute[181978]: 2026-01-12 13:57:58.008 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:02 compute-0 nova_compute[181978]: 2026-01-12 13:58:02.406 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:02 compute-0 podman[219199]: 2026-01-12 13:58:02.54904705 +0000 UTC m=+0.040891378 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Jan 12 13:58:02 compute-0 podman[219197]: 2026-01-12 13:58:02.566336826 +0000 UTC m=+0.061603319 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 12 13:58:02 compute-0 podman[219198]: 2026-01-12 13:58:02.578396948 +0000 UTC m=+0.072161068 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:58:03 compute-0 nova_compute[181978]: 2026-01-12 13:58:03.009 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:07 compute-0 nova_compute[181978]: 2026-01-12 13:58:07.408 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:08 compute-0 nova_compute[181978]: 2026-01-12 13:58:08.010 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:10 compute-0 podman[219260]: 2026-01-12 13:58:10.538638999 +0000 UTC m=+0.034495616 container health_status 58e57f9a8d3324bb421e82ba5d09a572914202dce8474c33247c040ec79e0184 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 12 13:58:12 compute-0 nova_compute[181978]: 2026-01-12 13:58:12.409 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:13 compute-0 nova_compute[181978]: 2026-01-12 13:58:13.011 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:17 compute-0 nova_compute[181978]: 2026-01-12 13:58:17.411 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:18 compute-0 nova_compute[181978]: 2026-01-12 13:58:18.011 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:21 compute-0 podman[219277]: 2026-01-12 13:58:21.542429084 +0000 UTC m=+0.036247078 container health_status 82295cd1a8cb32e1b1ad46e0efd0b4156eef47569150ef2c8f061f45ef2495aa (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 12 13:58:22 compute-0 nova_compute[181978]: 2026-01-12 13:58:22.413 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:23 compute-0 nova_compute[181978]: 2026-01-12 13:58:23.012 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:24 compute-0 podman[219298]: 2026-01-12 13:58:24.545530863 +0000 UTC m=+0.041260454 container health_status 6587634031b70456869df97ec914a0a2fa7e1191720d90d2f2c71947c304fd1c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 12 13:58:26 compute-0 nova_compute[181978]: 2026-01-12 13:58:26.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:27 compute-0 nova_compute[181978]: 2026-01-12 13:58:27.415 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.014 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.502 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.503 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.503 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.679 181991 WARNING nova.virt.libvirt.driver [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.680 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.38008499145508GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": 
"0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.680 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.680 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.727 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.727 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.749 181991 DEBUG nova.compute.provider_tree [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.761 181991 DEBUG nova.scheduler.client.report [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Inventory has not changed for provider 5f3fe3a8-f640-4221-8f9a-71aa07eebe17 based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.762 181991 DEBUG nova.compute.resource_tracker [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 12 13:58:28 compute-0 nova_compute[181978]: 2026-01-12 13:58:28.762 181991 DEBUG oslo_concurrency.lockutils [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 12 13:58:29 compute-0 nova_compute[181978]: 2026-01-12 13:58:29.761 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:29 compute-0 nova_compute[181978]: 2026-01-12 13:58:29.761 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:29 compute-0 nova_compute[181978]: 2026-01-12 13:58:29.761 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:29 compute-0 nova_compute[181978]: 2026-01-12 13:58:29.762 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 12 13:58:29 compute-0 sshd-session[219315]: Accepted publickey for zuul from 192.168.122.10 port 42186 ssh2: ECDSA SHA256:J6jdNF9u414OgDq+AauZa6hEV52MoCdxHYwLrSqkG1k
Jan 12 13:58:29 compute-0 systemd-logind[775]: New session 29 of user zuul.
Jan 12 13:58:29 compute-0 systemd[1]: Started Session 29 of User zuul.
Jan 12 13:58:29 compute-0 sshd-session[219315]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 12 13:58:30 compute-0 sudo[219319]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 12 13:58:30 compute-0 sudo[219319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 12 13:58:30 compute-0 nova_compute[181978]: 2026-01-12 13:58:30.480 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:30 compute-0 nova_compute[181978]: 2026-01-12 13:58:30.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 12 13:58:30 compute-0 nova_compute[181978]: 2026-01-12 13:58:30.481 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 12 13:58:30 compute-0 nova_compute[181978]: 2026-01-12 13:58:30.494 181991 DEBUG nova.compute.manager [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 12 13:58:30 compute-0 nova_compute[181978]: 2026-01-12 13:58:30.494 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:32 compute-0 nova_compute[181978]: 2026-01-12 13:58:32.417 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:33 compute-0 nova_compute[181978]: 2026-01-12 13:58:33.014 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:33 compute-0 ovs-vsctl[219479]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 12 13:58:33 compute-0 nova_compute[181978]: 2026-01-12 13:58:33.489 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:33 compute-0 nova_compute[181978]: 2026-01-12 13:58:33.489 181991 DEBUG oslo_service.periodic_task [None req-ded28b3a-162e-404f-96ad-ca089eae10e8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 12 13:58:33 compute-0 podman[219487]: 2026-01-12 13:58:33.565416607 +0000 UTC m=+0.057500115 container health_status c65de6a108e3b227d0619783fad2a0aad6ad7ffe4f79175ad3b4d234a8f3e5f1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 12 13:58:33 compute-0 podman[219484]: 2026-01-12 13:58:33.58463895 +0000 UTC m=+0.076693084 container health_status 317c4ca2a63f06d063b87f28fdca790474f7e768e751d039ec05dd0432c7696c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '849b5d9ea6ecfbfa83a43223224e5495df9910e7b3dd917beabba6e23b2277f2-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 12 13:58:33 compute-0 podman[219489]: 2026-01-12 13:58:33.596382312 +0000 UTC m=+0.084804913 container health_status e60b40cea6543bd75c75718a7e0316760dc0acf857bf2f5b84e3675f287f1bbd (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '4082ced30e6ad049df6a8d797636cced1c5007dd3b8d3c0f8e41b645dcf94e4f-bfe64a9be4ad33b711c387c52062c246b4ab570f953402ac6b4a5261a3dbcbc6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, config_id=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Jan 12 13:58:34 compute-0 virtqemud[153584]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 12 13:58:34 compute-0 virtqemud[153584]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 12 13:58:34 compute-0 virtqemud[153584]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 12 13:58:34 compute-0 crontab[219919]: (root) LIST (root)
Jan 12 13:58:36 compute-0 systemd[1]: Starting Hostname Service...
Jan 12 13:58:37 compute-0 systemd[1]: Started Hostname Service.
Jan 12 13:58:37 compute-0 nova_compute[181978]: 2026-01-12 13:58:37.418 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:38 compute-0 nova_compute[181978]: 2026-01-12 13:58:38.016 181991 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 12 13:58:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:58:40.211 104189 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 12 13:58:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:58:40.211 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 12 13:58:40 compute-0 ovn_metadata_agent[104184]: 2026-01-12 13:58:40.211 104189 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
